News and Resources
A collection of insights from SIMON and strategic partners
Please select one of our blog posts from the Table of Contents on the left side of this page.
Table of Contents
WORKFLOW EFFICIENCIES WARNING: “Never underestimate the importance of workflow efficiencies in education”.

A research study of 135 schools into the effectiveness of their use of learning and administration technologies.
The Secret
Every day at SIMON, our team works with the dedicated focus on delivering the best technology, service, and support for our schools in learning management and school administration. Sounds easy right? Well, let me fill you in on a little secret. It’s really, really hard.
Every school is unique. Every teacher is different. Every administrator has varied priorities and every parent wants everything delivered “immediately”. Oh, please bring back the days when a technology fix usually involved “turning your computer off and then on”. Today, schools are more complex than any similar-sized organisation. Whilst many businesses may standardise on a handful of core applications, schools need to deliver an engaging learning experience aligned to the respective age group, subject, location, curriculum standard, individual skill set, level of collaboration, assessment type, individualised learning approach and all mandatory administration and reporting requirements – and all this for just a teacher. Imagine extending this to the other common roles in a school such as a Principal, Department Head, Year Level Coordinator, IT Director, Finance, Enrolments and Wellbeing. I think you now see my point.
The Challenge
The SIMON learning management software solution is often the cornerstone of a school's system of applications, addressing many of the core business needs and activities, from parent communication, reporting and assessment to behavioural tracking.
It is also true that SIMON is but one of many applications in an interrelated system which will have grown and evolved over time with changing needs and innovation. New applications are added, new ways of working emerge, and additional complexity is introduced into the system. This cycle creates additional workflows, processes, costs and stress for teachers and administrators.
SIMON needed to examine the ways in which people conduct their work within and across applications, exposing the difference between design and reality. In particular, we examined:
- How SIMON interacts with other applications (or doesn't) at critical junctures in common workflows.
- Where double-handling and highly manual, repetitive work occurs.
- Skills gaps and their impact on learning and teaching practice.
In total, 81 Secondary, 45 Primary, 2,471 users and 4,043 workflows were reviewed.
Discovering “Spaghetti Duplication”
The following chart represents the volume of workflows captured from schools that use only 32% of SIMON functionality. These workflows span the major areas of:
- Classroom teaching
- Curriculum Development
- Parent and Student Communication
- Student Administration
- Assessment
- Behavioural Reporting
- Reporting
- Attendance
- Staff Information
- Finance
Each column represents an element of each workflow: role type, data type/activity, application being used to initiate the workflow, whether the data needed to be duplicated in another application and reason for duplication.
Chart 1: Duplication of workflows with schools using only 32% of SIMON features
Initial observation of the chart shows an incredibly high level of duplication of workflows across many different applications.
Upon further analysis of both survey data and qualitative interview data, it was wonderful to see that SIMON scored an average of 4.25 out of 5 against the features of:
- Assessment (rubrics, marking, feedback, assigning assessment tasks)
- Attendance (class rolls, arrivals/departures, absence notifications)
- Behavioural Management (student notes, permissions, pastoral incidents)
Now, you may be thinking SIMON is very self-indulgent to be bragging about the above data insight – well you are right. We are very proud of this, but as you will shortly read, SIMON has a bit of work to do in some other key areas.
Another key learning from this chart is that an invisible negative pattern bubbles beneath this level of duplication – stress. We have observed that the higher the level of duplication, the higher the level of stress amongst teachers and staff.
From all workflow analyses, we identified that schools using only 32% of SIMON features duplicated 47% of related activity in other applications to achieve a desired outcome.
The schools that used 64% of SIMON features experienced something far different.
Chart 2: Duplication of workflows with schools using 64% of SIMON features
The above chart highlights the workflow efficiencies gained when using a higher number of features within SIMON. We also observed decreased levels of frustration and stress.
The most common applications being used in schools are Google Classroom, MS Teams, Google Drive, MS OneDrive, Operoo, SeeSaw, Class Dojo, and the Student Management Systems. When we delve further into the “functions and purpose” of these applications, we start to gain a clearer picture of where schools should focus their efforts in alleviating their duplication and stress pain.
Light at the End of the Tunnel
SIMON’s Systems Analysis has examined the ways in which schools conduct their work within and across applications, identifying the workflows that bring efficiency to staff workloads, and those that decrease efficiency and weigh on daily activity.
Thomas Carr College
"We used to use four separate software packages, now with SIMON, the College gets all the features they need with one".
This analysis tested the assumption that a single, ideal “source of truth” application could service all needs from a workflow activity perspective – the answer is no. However, what SIMON does provide, when utilised well, is a centralised learning management system that creates, curates and disseminates information alongside other systems to achieve and sustain operational efficiency.
The opportunities for schools to reduce workflow duplication and stress can be prioritised according to the following categories:
- Document Management
- Integrations: Google, Microsoft
- Parent Communications
- Push Notifications
- Mobile Applications
- NCCD and Behavioural Portals
- Training of Staff
- Learner Analytics and Insights being made available to students
Change Management

Educators and administrators have recently experienced significant pressures influenced by the pandemic and the resulting remote learning. Taking the initiative to reduce workflow duplication should be prioritised in a way which is aligned to your school’s culture, project experience and staff capacity.
Whilst SIMON is an excellent platform to engage your community, the fact is that a technology platform is only as good as your school’s ability to successfully engage its users. Our recommendation is to take a moment’s pause to assess and discuss your current situation: discover what innovations exist in the education technology sector, and reflect on your internal workflows and community wellbeing.
About the Author
Danny Gruber is the Business and Partner Manager at SIMON – a leading not-for-profit learning management and administration platform supporting over 250 schools across Australia. He has been working with schools for over 12 years and has conducted over 200 Strategic ICT Reviews of schools into how they plan, leverage, and use technology.
If you would like to learn more about SIMON or have a confidential discussion about your current school situation, please contact SIMON at: info@simonschools.net
DATA ANALYTICS How teachers can use data analytics to improve instruction.

With the new school year fast approaching, teachers will soon be turning their attention to planning. With densely packed curriculums, full classes and the prospect of snap school closures, there's a heap to consider... but when it comes to school analytics, where to start? As an English teacher, turned Analytics Guy, I want to give you my five approaches for using data analytics while you review your instructional material.
The suggestions that follow are ordered by the time of year you might use them, beginning with the commencement of the school year. Like a funnel, the first suggestions look at data covering long spans of time, and we end with data that's useful for shorter-term decision-making.
Over the summer break
Getting to know your classes
Alongside any transition information you've received about your incoming students with particular needs, I'd want a sense of strengths and challenges in the class so that my initial instructional materials are best suited to meet them where they are.
Starting with NAPLAN and then moving onto ACER or Allwell (depending on whether your school runs these tests) I'd want to understand my class's levels of Literacy and Numeracy. How does the class compare to their year level cohort? How does their rate of growth against each Domain within Literacy and Numeracy compare to the State's rate of growth?
Primary school teachers may take this a step further and look at the same metrics for specific strand information - Reading is a good example of this. What NAPLAN tells us about a student's proficiency in reading will be a useful backdrop for the more real-time information collected in assessments such as Fountas & Pinnell. Having an easily accessed, longitudinally tracked reading record, particularly when compared to other results (pictured below) provides great insight into points of growth and challenge for your students.
For English and Maths teachers, I'd go a step further and examine NAPLAN and your PAT tests for the questions in each strand where students typically come unstuck. A common approach used for this kind of analysis is the Guttman chart and the analysis of each student's zone of proximal development. Knowing the exact NAPLAN and PAT questions where your students began to struggle can give you a crystal clear indication of their capability.
These two activities should help you answer questions on whether your first unit of work must start with greater focus on definitions or include a revision of materials underpinning the curriculum you need to cover.
My subject
Still in the pre-school-year period, I'd want to know which of my new students like my subject (and which potentially don't...), who has excelled and who may need support. Seeing my class's results for my Learning Area over time will certainly be helpful, but I'd also want a clear picture of:
- Who has required modified tasks and how were they modified?
- To what extent have the curriculum strands that I'll cover this semester been scaffolded, or already covered? What can an analysis of performance against the curriculum strands tell me about where to start my instructional materials and what can be skipped or simply revised?
During the school year
Term holiday planning
Many schools will supplement NAPLAN assessments with a range of PAT tests or the Allwell test. If you're teaching Grade 4 or 6, or you've got Year 8s or 10s, NAPLAN results are now dated, so results from PAT or Allwell tests become a useful counterpoint. These external assessments measure aptitude rather than performance against a curriculum, so there's not likely to be a need to radically rethink what you've got planned for next Term. They do, however, provide further evidence of whether a student's capability and their performance line up.
Task work: formative assessment
In my experience, this is the data that is most critical to informing changes we need to make to our instructional material. Formative assessment is second-nature to K-12 teachers, so acknowledging the considerable time and expertise that goes into this, I'll stay in my lane and restrict comments to analytics!
Your Learning Management System and external content providers such as Edrolo, Education Perfect and Essential Assessment have the capability to assess students against the curriculum at points of your choosing. Objectively assessed tasks, i.e. quizzes or multiple choice questions, which do not require teacher judgement (and thereby do not require assessment time) can be a litmus test for the extent to which a class is:
- engaging with the curriculum content.
- understanding the content and responding in a way that is appropriate to the task.
Seeing all of this data in one place as it transpires in real time, particularly as students are preparing for a major assessment, provides a snapshot of what material has landed well and what needs redressing.
Seen in retrospect, a comparison of formative assessment results to summative assessment results will also help you determine which of the preparatory tasks you'd set should be kept for the next time the unit of work is done.
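As a minimal sketch of that retrospective comparison (all scores below are invented, and a simple Pearson correlation stands in for whatever analysis your school actually runs), you could check how closely each formative task tracked the summative result:

```python
import math

# Invented scores: fraction correct on two formative quizzes and on the
# summative task, for a small class of five students.
quiz_1    = [0.90, 0.70, 0.50, 0.80, 0.30]
quiz_2    = [0.60, 0.60, 0.70, 0.50, 0.60]
summative = [0.85, 0.70, 0.45, 0.80, 0.35]

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A formative task whose scores track the summative result closely is a
# stronger candidate to keep for the next run of the unit.
print("quiz_1 vs summative:", round(pearson(quiz_1, summative), 2))
print("quiz_2 vs summative:", round(pearson(quiz_2, summative), 2))
```

A task that correlates strongly with the summative result was doing useful preparatory work; one that doesn't may need rethinking.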
Evaluating impact
The final suggestion I have concerns my area of passion - using technology to make the connections between our assessment practices and the curriculum frameworks which underpin our instructional materials explicit. Here, technology has a practical role to play in that the individual criterion in your assessment rubrics can be linked to curriculum codes. Whether your school follows the Victorian Curriculum, the NSW Syllabus or the National Curriculum, having rubric criteria linked in this way allows for the analysis of student performance and growth by Learning Area Strand.
What I like about this is that summative assessment when done in this way can be both a measure of performance and a measure of progress against the curriculum. Beyond the performance grade, this approach has the potential to:
- provide both teacher and student a clear indication of skill growth over time.
- provide equal importance to evaluating teaching's impact on progression as to a student's performance against the objectives.
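As a sketch of the strand-level analysis this linking enables (the criterion names, curriculum codes and scores below are all invented placeholders, not real Victorian Curriculum or NSW Syllabus codes), rubric scores tagged with curriculum codes can simply be grouped and averaged per code:

```python
from collections import defaultdict

# Hypothetical rubric results; each criterion is tagged with an invented
# curriculum code for illustration.
rubric_results = [
    {"criterion": "Analyses themes",       "code": "EN-LIT-01",  "score": 4},
    {"criterion": "Uses textual evidence", "code": "EN-LIT-01",  "score": 3},
    {"criterion": "Spelling and grammar",  "code": "EN-LANG-02", "score": 5},
    {"criterion": "Paragraph structure",   "code": "EN-LANG-02", "score": 4},
]

# Group criterion scores by the curriculum code they are linked to,
# then average, giving a per-strand view of performance.
by_code = defaultdict(list)
for result in rubric_results:
    by_code[result["code"]].append(result["score"])

averages = {code: sum(s) / len(s) for code, s in by_code.items()}
print(averages)
```

Run over successive assessments, the same grouping yields the growth-by-strand view described above rather than a single overall grade.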
In concluding, I want to point out a couple of (probably obvious) things. The first is that this list is far from exhaustive. The second is that while all of this analysis is possible without data analytics software, it is probably out of reach without a solution like Albitros. We automate the collection, calculations and visualisations required to conduct the activities described above.
***
This blog post has been reproduced with permission from SIMON's strategic analytics partner intellischool.co
The author of this insightful blog is Rowan Freeman, Intellischool's Product Manager.
Cover photo by Jess Bailey on Unsplash.
SIMON ANALYTICS Applying the insights from standardised tests to your lesson planning

Being able to capture and analyse the data from standardised tests such as NAPLAN, ACER and Allwell has become, in effect, table stakes for all analytics companies working with K-12 schools. Getting the data and pulling it together is easy enough, and visualising it in conditional categories which break down performance in terms of meeting or exceeding expectations is also fairly straightforward. This all reflects what schools would have done manually. Analytics software that automates this process saves a school a lot of manual effort, but in stopping there, analytics fails to truly address what teachers want to know next.
Working with our schools, and having them really push us on what we're doing with NAPLAN, ACER and Allwell data to better underpin their lesson planning, has been instrumental. So, cutting to the chase... what should we be analysing?
Aptitude vs Achievement
Standardised tests provide us with an indication of a student's capability at a point in time, measuring the skills they need to succeed as students and in life more generally. Teacher judgements and measures of achievement are an assessment of a student's ability to respond critically to a body of content that has been studied. The two are different but complementary in that they can help us understand whether a student is meeting their potential.
The figure below illustrates the Domain achievement and growth results for one student, for PAT Numeracy (yellow), NAPLAN's Numeracy (blue) and the achievement results for the subject Mathematics.
- The PAT Numeracy result indicates a high scaled score and significant improvement since the last PAT Numeracy test.
- The NAPLAN Numeracy result indicates a moderately high scaled score and a drop in performance since the last NAPLAN test.
The Mathematics subject score indicates that while this student has demonstrated aptitude in numeracy-related skills in standardised tests, this is not translating to strong achievement scores in Maths. Why is that?
Zone of Proximal Development
Knowing a student has aptitude is certainly useful for planning, but to truly differentiate instruction and assessment, we need to see more clearly where their strengths and challenges sit within each strand of the domain. Enter Vygotsky's (remember him from the second year of your teaching degree?) Zone of Proximal Development (ZPD). A critical aspect of Vygotsky's theory of learning is the notion that we can support student growth by identifying the space between what a learner can do unsupervised and what a learner can do with teacher guidance. In practical terms, this is the point at which a student starts to get more questions wrong than right during a standardised test like NAPLAN.
The image below is an example of a Guttman chart produced automatically in our analytics. It helps teachers identify their students' ZPD and see the specific skills they could target for individual students or small groups in their lesson planning. Along the left (rows) is each student in the cohort the teacher is looking at. At the top (columns) is each question from the standardised test, described by its curriculum descriptor. Students are sorted from most correct answers to least, and questions are sorted left to right from most frequently answered correctly to least. We end up with a kind of diagonal where green meets red. This diagonal approximates the class's ZPD; we've also calculated each student's individual ZPD, indicated by the gold dots. These are the skills which, under teacher-guided instruction, would constitute a 'stretch goal' for the student.
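As a minimal sketch of that sorting (the student names and 0/1 responses below are invented, and this is not Intellischool's actual implementation), the Guttman ordering and a simple per-student ZPD estimate take only a few lines:

```python
# Hypothetical 0/1 response data (1 = correct); names and answers invented.
responses = {
    "Ava":  [1, 0, 1, 1, 1, 0],
    "Ben":  [1, 0, 1, 1, 0, 1],
    "Caio": [1, 0, 1, 0, 0, 0],
    "Dee":  [0, 1, 1, 0, 0, 0],
}
n_q = 6

# Order questions from most to least frequently answered correctly.
q_correct = [sum(r[q] for r in responses.values()) for q in range(n_q)]
q_order = sorted(range(n_q), key=lambda q: q_correct[q], reverse=True)

# Order students from most to least total correct answers.
s_order = sorted(responses, key=lambda s: sum(responses[s]), reverse=True)

# Approximate each student's ZPD as the first question, in difficulty
# order, that they answered incorrectly.
zpd = {}
for s in s_order:
    row = [responses[s][q] for q in q_order]
    zpd[s] = next(i for i, v in enumerate(row) if v == 0)
    print(s, row, "first incorrect at column", zpd[s])
```

Printed in this order, the rows form the green-to-red diagonal, and each student's first incorrect column marks where teacher-guided 'stretch goals' begin.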
Growth Indicators
Standardised tests provide, within their data sets, useful benchmarking information against which student growth can be objectively measured. Whether it's the state mean, the national median or their classmates, a student's rate of growth becomes more meaningful when it's framed by context.
We've found that exactly what 'context' should be used differs from school to school! In a recent conversation, we determined that 'cohort' and 'like starting point' were critical to understanding a student's rate of growth. By this we mean a sub-set of students who took the same tests in 2019 and 2021 and whose 2019 scaled scores placed them in the same band of performance. So we limit the comparison group to those who had 'like' scores to begin with.
This is illustrated below: hovering over the arrow for this 'Reading' scaled score shows that Milley grew 26% compared to students who started with scores similar to hers in the previous test.
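As a minimal sketch of that 'like starting point' comparison (all names and scaled scores below are invented, and the simple point-growth average here stands in for whatever percentage metric the product actually reports), the idea is just to filter the cohort to similar starting scores before comparing growth:

```python
# Hypothetical 2019 and 2021 scaled scores; all names and numbers invented.
scores = {
    "Milley": (480, 560),
    "Ari":    (475, 520),
    "Bao":    (490, 510),
    "Cal":    (600, 640),  # different starting band, so excluded below
}

def growth_vs_like_peers(student, scores, band=25):
    """Compare a student's score growth to peers whose earlier score was
    within +/- `band` points of theirs (a simple 'like starting point')."""
    start, end = scores[student]
    peer_growth = [e - s for name, (s, e) in scores.items()
                   if name != student and abs(s - start) <= band]
    return end - start, sum(peer_growth) / len(peer_growth)

my_growth, peer_avg = growth_vs_like_peers("Milley", scores)
print(f"Milley grew {my_growth} points; like-peer average was {peer_avg:.1f}")
```

Narrowing the band tightens the 'like' comparison at the cost of a smaller peer group, which is exactly the trade-off each school weighs when choosing its context.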
Making it Meaningful
Automating the process of getting, collating and visualising NAPLAN, ACER, and other standardised testing data has meant that teachers and school leaders spend less time data wrangling and more time asking great questions. We've worked with our schools to really nail down those questions and to take the extra necessary steps to analyse and visualise the data in the way that they need to see it to address individual student needs.
***
This blog post has been reproduced with permission from SIMON's strategic analytics partner intellischool.co
The author of this insightful blog is Rowan Freeman, Intellischool's Product Manager.