As a SENCO or access arrangement assessor, you’ve got a lot on your plate. Juggling student support with the JCQ rules can feel overwhelming, but your role is vital in making sure every student gets what they need.
Add assessments into the mix, and even seasoned professionals can hit a few snags. The good news? Once you know what to look for, these pitfalls are easy to spot, and even easier to avoid.
In this guide, we’ll go over the most common access arrangement assessment mistakes we see in schools and give you some easy, real-world tips to steer clear of them. Whether you’re just starting out with assessments or looking for a quick refresher, this straightforward guide will help you stay confident and compliant.
1. Misunderstanding JCQ Criteria for Access Arrangements
What's the Issue?
It’s surprisingly easy to get caught out by changes to the JCQ rules—especially if you’re going by memory or last year’s notes. And if you miss an update, that can mean the evidence you submit doesn’t meet current standards, or worse, you end up recommending an access arrangement that no longer qualifies. Even a small oversight can have a big impact on your student’s support.
Why It Happens
JCQ guidelines are updated every academic year. While some changes are minor, others significantly affect how access arrangements are assessed and approved, like adjustments to the use of computer readers, or shifts in the threshold for extra time. The problem is, these updates don’t always jump out, and it’s easy to miss them if you’re not looking for them.
On top of that, many SENCOs juggle assessment with a dozen other responsibilities. When your time is stretched thin, sitting down to read through updated documentation can easily slip down the to-do list. And if you’re using materials or templates passed on from previous years or from another colleague, they may not reflect the most current expectations, even if they look correct.
How to Avoid It
- Review JCQ documentation annually, ideally at the start of each academic year. Highlight key changes and updates for quick reference.
- Attend CPD or refresher training, either independently or with your SEN team. Education Elephant run free webinars every year with JCQ updates.
- Create a live checklist or summary sheet of the criteria for each type of access arrangement (e.g., extra time, reader, scribe).
- Discuss cases with colleagues—what qualifies and doesn’t isn’t always black and white.
Clarity is your friend when it comes to JCQ. Keeping current ensures every decision you make stands up to scrutiny.
Watch the JCQ Changes and Updates (2024/25) Webinar
2. Using Outdated or Invalid Assessments
What's the Issue?
Relying on assessments that are no longer current or don’t meet standardisation criteria can invalidate an application. It may also result in missed opportunities to support a student appropriately.
Why It Happens
It’s common for assessors to inherit assessment tools from previous staff members or continue using familiar materials they’ve grown comfortable with over the years. However, if those tools are very old or a newer version is available, they may not be suitable for use, even if they still seem accurate.
Another frequent issue is using assessments in the wrong JCQ category. For example, applying a test meant for measuring reading accuracy as evidence for processing speed can lead to compliance issues and even malpractice. This can happen if you are unfamiliar with the JCQ regulations, or when tools are selected without revisiting the test manual or cross-checking JCQ’s categorisation.
Even highly experienced assessors can fall into this trap, especially when trying to work efficiently or under pressure. The risk is that, during inspection, the use of an incorrect or outdated tool could raise concerns—not just about the evidence submitted, but about the whole process used to reach the recommendation.
JCQ compliance isn’t just about having data—it’s about using the right data, in the right way, at the right time.
How to Avoid It
- Ensure that the assessments you’re using are current and relevant. You want to be sure they’re nationally normed, based on a good, representative sample of students, and the most reliable, up-to-date version of the test (e.g. if a newer edition of a test is out, you must use that version).
- Cross-reference your toolkit with JCQ guidelines and keep a log of each tool’s function. For example, reading fluency tests measure speed and can be used as evidence for extra time; reading comprehension tests can be used to support readers (but not extra time).
- Look for publisher updates—test publishers often provide summaries of changes and whether a tool meets JCQ standards.
- Switch to digital assessments like SWIFT – with automatic updates and instant, JCQ-aligned reports, you’ll save time and ensure accuracy in your work.
The right tools don’t just meet compliance—they give you confidence in your decisions.
3. Inadequate Evidence of Normal Way of Working
What's the Issue?
Having test scores which meet the JCQ access arrangement criteria is only part of the picture. If you can’t demonstrate that the recommended arrangement is already in place as part of the student’s everyday classroom experience, the application may be declined. JCQ is very clear: access arrangements must reflect both a genuine need and the student’s normal way of working. In other words, support in exams should mirror support in the classroom—not just respond to a score on paper.
Why It Happens
It’s understandable to focus heavily on assessment data—especially when the results clearly indicate difficulty, such as slow processing speed or weak reading accuracy. Those numbers feel like solid evidence. But without supporting documentation that shows this need is being addressed during lessons, the application won’t meet JCQ standards.
This often happens because the ‘normal way of working’ evidence is less structured or formalised. Class teachers may be offering informal support—like extra time, quiet working spaces, or a reader—but it’s not always recorded. And in a busy school environment, tracking and logging every instance of support can feel like just another task on a long list.
Sometimes, assessors also assume that the presence of a score implies support is being given, or that a recommendation based on good practice will be enough on its own. Unfortunately, that’s not how JCQ sees it. They want a full, joined-up picture: the need, the support in action, and a clear link between the two.
The most effective applications are built on two pillars: solid assessment data and consistent classroom practice. You need both.
How to Avoid It
- Begin a structured approach to collecting evidence as early as possible in a student’s academic career, and identify students who require support early (e.g. screening tests showing low scores on spelling).
- Keep a working folder with ongoing work samples produced under normal conditions and with support in place.
- Log the support strategies used in class, particularly for students receiving extra time or a reader.
- Collect teacher comments using a straightforward form (e.g. tick boxes) or digital survey, ensuring that you document when and how support is provided.
- Observe students in class where possible—are they consistently using the arrangement you’re recommending?
Assessment and provision should always walk hand-in-hand.
Download a Free Access Arrangement Starter Kit!
(Includes a JCQ access arrangements criteria summary, an internal access arrangements checklist for schools, a “Normal way of working” JCQ guide & teacher template, and more)
4. Confusing Standard Scores and Percentile Ranks
What's the Issue?
Mixing up standard scores, percentiles, or confidence intervals can lead to incorrect recommendations—and potentially weaken your access arrangements application. Even small misinterpretations can lead to incorrect conclusions about need.
Why It Happens
Assessment data can be tricky to navigate—especially when you’re working across multiple tools, each with their own formats and reporting styles. Some tests present results as standard scores, others as percentiles, and some may include age-equivalents or scaled scores. If you’re switching between these formats frequently, it’s easy to get your wires crossed.
On top of that, score thresholds can vary subtly from one assessment to another. For example, a standard score of 84 may be considered “below average” on one test, while another uses slightly different cutoff points for diagnostic interpretation. Without a clear understanding of the scoring system and how JCQ expects those scores to be presented, even experienced assessors can make accidental errors in reporting or justification.
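To see why the distinction matters, here’s a rough reference, assuming the scale most standardised tests use (a mean of 100 and a standard deviation of 15). A standard score of 85 sits one standard deviation below the mean, meaning roughly 16% of same-age peers would score at or below that level. Approximate equivalents look like this, though you should always confirm exact values in your test manual:
- Standard score 100 ≈ 50th percentile (exactly average)
- Standard score 95 ≈ 37th percentile
- Standard score 85 ≈ 16th percentile
- Standard score 84 ≈ 14th percentile (the “below average” threshold mentioned above)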
Time pressure also plays a role. When you’re balancing a full workload, it’s tempting to input scores quickly without double-checking conversions or referencing the test manual. But a misplaced percentile, or confusing a standard score of 85 with one of 95, can change the perceived level of need, and that has consequences for both the student and the validity of the recommendation.
Getting the numbers right isn’t just about accuracy—it’s about building trust in your decisions.
How to Avoid It
- Refresh your understanding regularly, especially when new tests are introduced to your toolkit.
- Use score conversion charts to double-check percentiles and standard score equivalents.
- Ensure consistency in how you present results across reports and applications.
- Let digital tools do the work—platforms like SWIFT automatically generate JCQ-aligned data with minimal room for error.
When you’re clear and accurate in how you explain the scores, it really strengthens your case. Plus, it shows you really know your stuff.
5. Over-Assessing or Under-Assessing
What's the Issue?
Providing too many test results can overwhelm your application and obscure the key message, making it harder for reviewers to understand the student’s specific need. Additionally, throwing a whole battery of tests at a student, or repeating tests in the hope of achieving one standard score below 84, is not terribly ethical either.
On the other hand, submitting too few results can create gaps in the evidence and leave questions about whether the recommended access arrangement is fully justified (e.g. two scores are required for extra time). Similarly, failing to conduct enough tests may result in a student being denied an access arrangement because you have failed to capture their difficulty with the right test (e.g. only assessing reading fluency and not working memory). Either way, the application loses clarity and the student may lose out.
Why It Happens
This often comes down to balancing time, confidence, and interpretation. Some assessors worry that a single test result may not be convincing enough, so they overcompensate by running a full battery of assessments, hoping that more data will strengthen the case. But instead, this can create conflicting results or distract from the most relevant findings. Other times, assessors feel under pressure when a test result does not meet the criteria, so they re-test in the same area in the hope that this time they will receive a low score.
Conversely, other assessors prefer a more streamlined approach, particularly when under pressure. They may conduct the bare minimum of testing to save time or rely heavily on one piece of data without broader context. While this may feel efficient, it can leave assessors open to scrutiny if the application lacks sufficient evidence across required areas—like processing speed, reading fluency, or comprehension. Failure to test students in pertinent areas (e.g. processing speed and working memory) may also lead to difficulties not being captured and students being ineligible.
There’s also the influence of habit. Some schools or individuals have “go-to” assessment routines that may not always reflect best practice or the specific needs of each student. Without pausing to reflect on why each test is being chosen, it’s easy to default to a one-size-fits-all approach, and that’s something to be very cautious of.
A well-judged, focused assessment tells a clear story—and that’s what makes it powerful.
How to Avoid It
- Map your assessment battery to JCQ requirements: if applying for extra time, ensure that processing speed is covered. If there are gaps, fill them (e.g. if you have no test for reading fluency, then maybe it’s time to invest in one).
- Don’t test for the sake of it—choose assessments that reflect the specific difficulty the student is experiencing.
- Provide a focused picture, using 2–3 targeted assessments rather than a full battery unless clearly necessary.
- Add narrative interpretation to help the data speak—what do these scores mean for this student?
Honestly, sometimes just a couple of really good tests can tell you way more than a whole pile of numbers!
6. Writing a Vague or Incomplete Application
What's the Issue?
Even if the assessment itself is accurate, a report that’s confusing, overly technical, or missing key information can derail an otherwise solid application. JCQ requires that reports provide clear evidence of need, align with current criteria, and offer a well-reasoned recommendation that reflects both the student’s profile and classroom practice. If any of these elements are unclear or absent, the application may be delayed—or rejected altogether.
Why It Happens
Writing assessment reports takes time and mental energy, especially when juggling a busy school day. Under pressure, it’s tempting to copy and paste from previous reports or lean heavily on pre-filled templates. While templates can be helpful for structure, if they’re not carefully tailored to the individual student, the result may feel generic or disconnected from the student’s actual needs.
Another common issue is the overuse of technical language. As trained assessors, it’s easy to slip into professional shorthand or academic terminology—especially when interpreting complex data. But reports are often read by SENCOs, teachers, parents, and JCQ inspectors, many of whom may not have the same level of familiarity with assessment terms. If a reader can’t quickly understand the connection between the test results and the recommended support, the report risks losing its effectiveness.
Omissions are also a risk, particularly when reports are completed in a rush. Leaving out contextual details—such as the student’s normal way of working, classroom strategies already in place, or a clear justification for the chosen arrangement—can weaken the entire application. Even a strong set of scores can’t speak for itself without a well-explained narrative behind it.
A good report doesn’t just meet requirements—it brings clarity, insight, and professional confidence to your recommendation.
How to Avoid It
- Use a consistent structure: summary, scores, interpretation, evidence of need, recommendation.
- Avoid jargon—write for your audience, which may include a parent, teacher, or JCQ inspector.
- Check your links: make sure the score justifies the need, and the need aligns with what’s in place already.
- Proofread before submitting, especially if the report will be used for access arrangements or shared externally.
What you want is a report that really tells the student’s story, plain and simple, and in a way that grabs attention.
7. Forgetting the Student Voice
What's the Issue?
Leaving out the student’s perspective can make an otherwise well-supported application feel incomplete. JCQ places clear emphasis on ensuring that access arrangements are not only based on evidence of need and classroom practice—but also reflect the student’s lived experience. This is especially important for arrangements like readers, scribes, or prompts, where student comfort and familiarity with the support can influence its effectiveness in exam settings.
When the student voice is missing, applications may raise questions during inspection about whether the arrangement is truly necessary, understood, and accepted by the learner. Including it strengthens the case and shows that the student is an active participant in their own learning journey.
Why It Happens
In the day-to-day demands of school life, collecting student voice can feel like an extra step that’s easy to postpone. With paperwork mounting and deadlines looming, assessors may prioritise formal test results and teacher feedback over direct input from the student.
In some cases, assessors assume they already understand the student’s experience based on classroom observations or conversations with staff. But JCQ is clear: student input isn’t optional—it’s a valued part of the process. Not including this aspect weakens the application and misses a chance to empower learners. It is important to give them control over their support.
Sometimes, too, it’s about confidence—some students find it difficult to express their needs or may not feel comfortable articulating what helps them. But even a short, guided conversation or a simple questionnaire can surface valuable insights that deepen understanding and humanise the report.
When we listen to students, we build trust—and when they feel heard, the support we provide becomes more meaningful.
How to Avoid It
- Conduct a short interview or use a simple form to gather insights into how the student feels about learning and support.
- Include a direct quote in your report, e.g. “I get anxious when I have to read aloud” or “Extra time helps me plan properly.”
- Observe behaviour patterns, especially for students who struggle to articulate their needs verbally.
- Revisit student voice regularly, especially if the arrangement changes or new concerns emerge.
Remember, students know themselves best. So, make sure you get their input, their voice, into the whole process.
8. Rushing the Access Arrangement Process
What's the Issue?
Assessments that are carried out at the last minute often lack the depth and documentation needed to support a strong access arrangements application. When time is tight, it becomes much easier to miss key details—whether that’s forgetting to collect classroom evidence, skipping a crucial test, or rushing through interpretation. These rushed assessments may result in reports that feel vague or incomplete, and ultimately, in decisions that don’t fully reflect the student’s needs.
Why It Happens
In many schools, requests for access arrangements peak in Year 10, just as students begin preparing for a formal exam. It’s not uncommon for SENCOs to receive a flurry of late referrals from subject teachers who suddenly realise a student might need support—or from students themselves who are struggling under exam pressure.
This last-minute surge places a huge demand on assessors, who may already be balancing teaching, team leadership, and other responsibilities. As a result, assessments are squeezed into small windows of time, often with little opportunity for classroom observation or follow-up.
There may also be systemic issues—such as a lack of early screening procedures, limited staff availability for support, or a reactive rather than proactive approach to identifying need. In some cases, schools delay assessments out of caution, unsure whether a student’s needs are long-term or temporary, only to find themselves up against a tight deadline when exams draw near.
And of course, we’re all human. Sometimes it simply comes down to juggling too much at once. When assessors are stretched, something has to give—and unfortunately, that something is often time.
Good assessment takes preparation. Building in time for reflection, observation, and follow-up gives every student a stronger foundation for support.
How to Avoid It
- Screen earlier, ideally in Year 7 (!) and maybe again in Year 9, giving you plenty of time to observe, collect evidence, and trial arrangements.
- Block time for assessments in your calendar at regular intervals.
- Use a tracking system to log students who might need testing later, so you’re never starting from zero.
- Communicate deadlines to staff so referrals come with enough lead time.
Good assessments take time—build that time into your year plan.
9. Working in Isolation
What's the Issue?
Without peer support, it’s easy to lose confidence or miss opportunities for professional growth. Working alone can make it harder to benchmark your practices or stay current with evolving expectations.
Why It Happens
Many schools only have one qualified assessor or SENCO, which naturally leads to a more solitary working environment. In smaller or rural schools, you may be the only person with assessment training, leaving little opportunity to share ideas or review decisions with someone who understands the JCQ framework in depth.
Time is another major factor. Even when there are other trained staff in your wider network or multi-academy trust, finding time for meaningful discussion can feel like a luxury. Between assessments, EHCP paperwork, team meetings, and lesson planning, collaboration often ends up on the back burner.
There’s also the nature of the role itself—assessors are often seen as “specialists” working quietly in the background. Because assessment work happens outside the classroom and behind closed doors, it’s easy to feel a little disconnected from the wider SEN team or school staff.
Lastly, confidence plays a role. Some assessors—especially those new to the role or recently qualified—may hesitate to reach out for input, unsure if they’re “doing it right” or worried about asking questions that might seem basic.
But the truth is, everyone benefits from a second set of eyes, a shared experience, or a sounding board—no matter how experienced they are.
How to Avoid It
- Join a professional network—online communities, SENCO forums, or Education Elephant’s user group (Facebook & member community).
- Connect with local schools to set up informal peer review or support circles.
- Share dilemmas or questions during CPD events—chances are, others are facing the same challenges.
- Use external trainers or assessors for tricky cases or second opinions.
It can feel like you’re the only assessor sometimes, but honestly, you’re definitely not alone in this.
10. Neglecting Your Own CPD
What's the Issue?
Skipping professional development might feel harmless in the short term, but over time it can leave you feeling unsure, out of touch, or underprepared—particularly as JCQ regulations evolve, new assessment tools emerge, and best practices shift. Without regular CPD, it’s harder to stay sharp, confident, and aligned with current standards. And when you’re not fully up to date, that uncertainty can creep into your assessments and reporting.
Why It Happens
It’s no secret that SENCOs and assessors are among the busiest people in a school. With referrals to manage, reports to write, provision to coordinate, and deadlines constantly approaching, carving out time for your own development often feels like a luxury rather than a necessity. CPD slides down the to-do list—not because it isn’t valued, but because the urgent always seems to win out over the important.
Sometimes there’s also a feeling of “I already know this”—especially for experienced assessors who’ve been doing the job for years. But with JCQ making annual updates, and new tools like digital assessments becoming more widely used, even seasoned professionals benefit from a regular refresh.
In smaller schools or teams, assessors may also feel isolated, with fewer opportunities to attend training or connect with peers. And in some cases, the CPD available doesn’t feel relevant, practical, or easily accessible—so it gets skipped entirely.
But high-quality, well-targeted CPD doesn’t just boost your knowledge. It reinforces your decision-making, helps you feel more confident in your reports, and ultimately leads to better outcomes for students.
Investing in your own learning is one of the most powerful ways to support the learning of others.
How to Avoid It
- Schedule CPD into your year plan, the same way you would a staff meeting or parent evening.
- Attend short, focused sessions online or in-person—it doesn’t always need to be a full course.
- Look for qualifications like the ETAAC Level 7, which deepen your assessment knowledge and qualify you as an access arrangement assessor.
- Stay informed with blogs, newsletters, and industry updates.
When you keep learning and growing professionally, your students are the ones who really benefit in the end.
Final Thoughts: Support Starts With You
You’re already doing a ton to make sure students are heard and supported. Getting assessments right is a big part of that, and honestly, it doesn’t have to feel like a mountain to climb. With the right tools, a solid process, and up-to-date knowledge, you can assess with clarity, confidence, and compassion.
We’re here to help—explore our CPD programmes, browse our assessment tools, or sign up to our newsletter for practical tips straight to your inbox. Let’s keep making a difference—one student, one assessment at a time.