Evaluating the "Using Euros" Project Results
A Comprehensive Analysis
Abstract
This study evaluates the effectiveness and usability of the results produced by the "Using Euros" project through a comprehensive analysis of user feedback. A total of 79 participants assessed the platform across 24 dimensions spanning technical functionality, pedagogical value, and learning experience. Using a 5-point Likert scale, respondents evaluated aspects including interface usability, technical reliability, pedagogical integration, and learning outcomes. Statistical analysis identified highly correlated features and distinct factor clusters that characterize the user experience. Results indicate an overwhelmingly positive reception (mean scores above 4.6 on all dimensions), with the strongest consensus regarding life skills development, learner engagement, and alignment with learning goals. User experience and learning engagement elements formed the most robust factor cluster, while parameter configuration and progress assessment emerged as distinct considerations. This research provides critical insights for educational technology developers and practitioners seeking to implement effective digital learning platforms that balance technical reliability with pedagogical value.
DESCRIPTIVE STATISTICS
# | count | mean | std | min | 25% | 50% | 75% | max | Question |
---|---|---|---|---|---|---|---|---|---|
0 | 79 | 4.67 | 0.75 | 1 | 5 | 5 | 5 | 5 | Web-site functions without technical glitches (1-5) |
1 | 79 | 4.76 | 0.56 | 3 | 5 | 5 | 5 | 5 | The instructions provided are clear (1-5) |
2 | 79 | 4.75 | 0.57 | 3 | 5 | 5 | 5 | 5 | Interface is user friendly (1-5) |
3 | 79 | 4.77 | 0.51 | 3 | 5 | 5 | 5 | 5 | Quality of graphics (1-5) |
4 | 76 | 4.78 | 0.51 | 3 | 5 | 5 | 5 | 5 | Applications work on at least three browsers (1-5) |
5 | 76 | 4.72 | 0.53 | 3 | 5 | 5 | 5 | 5 | Popup and warning messages are helpful (1-5) |
6 | 78 | 4.74 | 0.67 | 2 | 5 | 5 | 5 | 5 | Activities can easily be shared among other users (1-5) |
7 | 79 | 4.66 | 0.57 | 3 | 4 | 5 | 5 | 5 | Setting the parameters is an easy process (1-5) |
8 | 79 | 4.70 | 0.61 | 3 | 5 | 5 | 5 | 5 | Users easily run the Applications (1-5) |
9 | 79 | 4.84 | 0.46 | 3 | 5 | 5 | 5 | 5 | Scanning works as required (1-5) |
10 | 79 | 4.78 | 0.52 | 3 | 5 | 5 | 5 | 5 | Printouts are of good quality (1-5) |
11 | 79 | 4.80 | 0.49 | 3 | 5 | 5 | 5 | 5 | Universal Design for Learning is applied (1-5) |
12 | 79 | 4.89 | 0.32 | 4 | 5 | 5 | 5 | 5 | Skills to be acquired are important life skills (1-5) |
13 | 79 | 4.85 | 0.46 | 3 | 5 | 5 | 5 | 5 | Feedback helps teachers to monitor students' progress (1-5) |
14 | 79 | 4.82 | 0.42 | 3 | 5 | 5 | 5 | 5 | Applications can be integrated in the curriculum (1-5) |
15 | 79 | 4.68 | 0.57 | 3 | 4 | 5 | 5 | 5 | Sample Lesson plans are useful (1-5) |
16 | 79 | 4.86 | 0.35 | 4 | 5 | 5 | 5 | 5 | Users are actively involved in the learning process (1-5) |
17 | 79 | 4.76 | 0.46 | 3 | 5 | 5 | 5 | 5 | Notes on Methodologies are helpful (1-5) |
18 | 79 | 4.77 | 0.53 | 3 | 5 | 5 | 5 | 5 | Self evaluation motivates students (1-5) |
19 | 79 | 4.87 | 0.43 | 3 | 5 | 5 | 5 | 5 | Students can work at their own pace (1-5) |
20 | 78 | 4.85 | 0.36 | 4 | 5 | 5 | 5 | 5 | Information provided relates to the learning goals (1-5) |
21 | 79 | 4.87 | 0.37 | 3 | 5 | 5 | 5 | 5 | Teachers/parents can easily adapt Activities to user's needs (1-5) |
22 | 79 | 4.77 | 0.48 | 3 | 5 | 5 | 5 | 5 | Teachers can easily assess students' progress (1-5) |
23 | 79 | 4.70 | 0.63 | 3 | 5 | 5 | 5 | 5 | Activities can be used for group work as well as for individual work (1-5) |
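The summary above has the shape of pandas' `describe()` output, one row per question. A minimal sketch of how such a table is produced, using a small invented set of Likert responses rather than the project's real data (which has 79 respondents and 24 questions):

```python
import pandas as pd

# Invented Likert responses (1-5) for two survey items; the real survey
# collected 79 responses across 24 questions.
df = pd.DataFrame({
    "Web-site functions without technical glitches": [5, 5, 4, 5, 3, 5, 5, 1],
    "The instructions provided are clear":           [5, 4, 5, 5, 5, 5, 4, 5],
})

# describe() yields count, mean, std, min, quartiles and max per column;
# transposing gives one row per question, as in the table above.
summary = df.describe().T
print(summary[["count", "mean", "std", "min", "50%", "max"]].round(2))
```

Note that `count` reflects non-missing responses only, which is why a few items in the real table have counts of 76 or 78.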
High variation questions
Label | Question | std |
---|---|---|
A1 | Web-site functions without technical glitches (1-5) | 0.75 |
A7 | Activities can easily be shared among other users (1-5) | 0.67 |
C12 | Activities can be used for group work as well as for individual work (1-5) | 0.63 |
A9 | Users easily run the Applications (1-5) | 0.61 |
A8 | Setting the parameters is an easy process (1-5) | 0.57 |
Low variation questions
Label | Question | std |
---|---|---|
B1 | Skills to be acquired are important life skills (1-5) | 0.32 |
C5 | Users are actively involved in the learning process (1-5) | 0.35 |
C9 | Information provided relates to the learning goals (1-5) | 0.36 |
C10 | Teachers/parents can easily adapt Activities to user's needs (1-5) | 0.37 |
B3 | Applications can be integrated in the curriculum (1-5) | 0.42 |
Understanding Variation in Responses (Standard Deviation)
The standard deviation (std) in the tables above represents how much the responses to each question deviate from the mean. Here's what it tells us:
Low Standard Deviation (Near 0) → High Agreement
If the standard deviation is close to 0, it means that most respondents gave very similar ratings (e.g., everyone rated a question as 5). This suggests strong consensus on the rating.
Example: If a question about "clarity of instructions" has a std = 0, it means everyone agreed on a particular score (all rated it as 5 or all as 4, etc.).
High Standard Deviation (Above 0.5) → More Diverse Opinions
A higher standard deviation means responses varied significantly among respondents. This suggests mixed opinions—some rated it high, while others rated it low.
Example: If a question about "interface friendliness" has std = 0.8, it indicates some people found it very user-friendly (5), while others rated it lower (3-4), showing disagreement.
Why is Variation Important?
- Low variation → Indicates strong consensus (either the feature is really good or really bad).
- High variation → Identifies areas for improvement (some users are satisfied while others are not, pointing to possible usability concerns).
Questions with high standard deviations might need further investigation to understand why opinions are divided.
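This screening step can be sketched in a few lines of pandas; the per-question standard deviations and the 0.5 cut-off below simply echo the tables above:

```python
import pandas as pd

# Per-question standard deviations (values echo the tables above).
stds = pd.Series({
    "A1": 0.75, "A7": 0.67, "C12": 0.63,  # more diverse opinions
    "B1": 0.32, "C5": 0.35, "C9": 0.36,   # strong consensus
})

# Questions above the 0.5 threshold are flagged for further investigation.
high_variation = stds[stds > 0.5].sort_values(ascending=False)
low_variation = stds[stds <= 0.5].sort_values()

print("Investigate further:", list(high_variation.index))
print("Strong consensus:  ", list(low_variation.index))
```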
Question labels by category
Label | Question |
---|---|
A1 | Web-site functions without technical glitches (1-5) |
A2 | The instructions provided are clear (1-5) |
A3 | Interface is user friendly (1-5) |
A4 | Quality of graphics (1-5) |
A5 | Applications work on at least three browsers (1-5) |
A6 | Popup and warning messages are helpful (1-5) |
A7 | Activities can easily be shared among other users (1-5) |
A8 | Setting the parameters is an easy process (1-5) |
A9 | Users easily run the Applications (1-5) |
A10 | Scanning works as required (1-5) |
A11 | Printouts are of good quality (1-5) |
A12 | Universal Design for Learning is applied (1-5) |
B1 | Skills to be acquired are important life skills (1-5) |
B2 | Feedback helps teachers to monitor students' progress (1-5) |
B3 | Applications can be integrated in the curriculum (1-5) |
B4 | Sample Lesson plans are useful (1-5) |
C5 | Users are actively involved in the learning process (1-5) |
C6 | Notes on Methodologies are helpful (1-5) |
C7 | Self evaluation motivates students (1-5) |
C8 | Students can work at their own pace (1-5) |
C9 | Information provided relates to the learning goals (1-5) |
C10 | Teachers/parents can easily adapt Activities to user's needs (1-5) |
C11 | Teachers can easily assess students' progress (1-5) |
C12 | Activities can be used for group work as well as for individual work (1-5) |
CORRELATIONS BETWEEN QUESTIONS
Strongest correlations between questions (correlation above 0.7), indicating that responses to these questions are highly related.
Label 1 | Question 1 | Label 2 | Question 2 | Correlation |
---|---|---|---|---|
A1 | Web-site functions without technical glitches (1-5) | C5 | Users are actively involved in the learning process (1-5) | 0.71 |
A2 | The instructions provided are clear (1-5) | A9 | Users easily run the Applications (1-5) | 0.73 |
A3 | Interface is user friendly (1-5) | A6 | Popup and warning messages are helpful (1-5) | 0.78 |
A4 | Quality of graphics (1-5) | A11 | Printouts are of good quality (1-5) | 0.73 |
A4 | Quality of graphics (1-5) | B1 | Skills to be acquired are important life skills (1-5) | 0.71 |
A6 | Popup and warning messages are helpful (1-5) | B3 | Applications can be integrated in the curriculum (1-5) | 0.72 |
A7 | Activities can easily be shared among other users (1-5) | A9 | Users easily run the Applications (1-5) | 0.75 |
A9 | Users easily run the Applications (1-5) | A11 | Printouts are of good quality (1-5) | 0.76 |
A10 | Scanning works as required (1-5) | A11 | Printouts are of good quality (1-5) | 0.75 |
A10 | Scanning works as required (1-5) | A12 | Universal Design for Learning is applied (1-5) | 0.75 |
C6 | Notes on Methodologies are helpful (1-5) | C9 | Information provided relates to the learning goals (1-5) | 0.70 |
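A pair list like the one above can be extracted from the response matrix with pandas' `corr()`. The sketch below uses synthetic data in which two items share a latent component, so only that pair clears the 0.7 threshold; the column names are illustrative only, not the project's data:

```python
import numpy as np
import pandas as pd

# Synthetic responses: A3 and A6 share a common component, B1 is independent.
rng = np.random.default_rng(0)
base = rng.integers(3, 6, size=79).astype(float)
noise = rng.choice([-1, 0, 1], size=79, p=[0.15, 0.7, 0.15])
df = pd.DataFrame({
    "A3": base,
    "A6": np.clip(base + noise, 1, 5),
    "B1": rng.integers(3, 6, size=79).astype(float),
})

# Pairwise correlations; scan the upper triangle so each pair appears once,
# then keep only pairs above the 0.7 threshold used in the table above.
corr = df.corr()
strong = [
    (a, b, round(corr.loc[a, b], 2))
    for i, a in enumerate(corr.columns)
    for b in corr.columns[i + 1:]
    if corr.loc[a, b] > 0.7
]
print(strong)
```

With the seed fixed, only the planted A3/A6 pair is reported; the independent item B1 never crosses the threshold.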
CLUSTER ANALYSIS
Identified clusters of questions based on the Exploratory Factor Analysis (EFA):
Clusters & Their Grouped Questions:
Factor 1 (Largest Cluster):
- Focuses on usability, user experience, and learning engagement.
- Includes questions about interface friendliness, clarity of instructions, ease of use, feedback, and self-paced learning.
Factor 2:
- Relates to technical performance & accessibility.
- Includes questions about graphics quality, browser compatibility, helpful warnings, print quality, and lesson plans.
Factor 3:
- Contains only one question, related to teachers assessing student progress.
Factor 4:
- Also contains only one question, focused on ease of setting parameters.
Dominant Factor | Label | Question |
---|---|---|
Factor 1 | A1 | Web-site functions without technical glitches (1-5) |
Factor 1 | A2 | The instructions provided are clear (1-5) |
Factor 1 | A3 | Interface is user friendly (1-5) |
Factor 1 | A7 | Activities can easily be shared among other users (1-5) |
Factor 1 | A9 | Users easily run the Applications (1-5) |
Factor 1 | A10 | Scanning works as required (1-5) |
Factor 1 | A12 | Universal Design for Learning is applied (1-5) |
Factor 1 | B1 | Skills to be acquired are important life skills (1-5) |
Factor 1 | B2 | Feedback helps teachers to monitor students' progress (1-5) |
Factor 1 | B3 | Applications can be integrated in the curriculum (1-5) |
Factor 1 | C5 | Users are actively involved in the learning process (1-5) |
Factor 1 | C7 | Self evaluation motivates students (1-5) |
Factor 1 | C8 | Students can work at their own pace (1-5) |
Factor 1 | C9 | Information provided relates to the learning goals (1-5) |
Factor 1 | C10 | Teachers/parents can easily adapt Activities to user's needs (1-5) |
Factor 2 | A4 | Quality of graphics (1-5) |
Factor 2 | A5 | Applications work on at least three browsers (1-5) |
Factor 2 | A6 | Popup and warning messages are helpful (1-5) |
Factor 2 | A11 | Printouts are of good quality (1-5) |
Factor 2 | B4 | Sample Lesson plans are useful (1-5) |
Factor 2 | C6 | Notes on Methodologies are helpful (1-5) |
Factor 2 | C12 | Activities can be used for group work as well as for individual work (1-5) |
Factor 3 | C11 | Teachers can easily assess students' progress (1-5) |
Factor 4 | A8 | Setting the parameters is an easy process (1-5) |
What Does This Mean?
Factor 1 suggests a strong relationship between usability, learning effectiveness, and user engagement.
Factor 2 indicates that graphics quality, browser compatibility, and structured lesson plans are linked together.
Factors 3 & 4 are outliers, indicating standalone concerns that don't strongly group with others.
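A grouping of this kind can be approximated with an off-the-shelf factor analysis by assigning each question to the factor on which it loads most strongly (its dominant factor). The sketch below uses scikit-learn's `FactorAnalysis` on synthetic data with two planted latent traits, not the project's real responses, and the item labels are illustrative:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic 79 x 6 response matrix: three items driven by one latent trait,
# three by another, plus item-specific noise.
rng = np.random.default_rng(1)
trait_a = rng.normal(size=(79, 1))
trait_b = rng.normal(size=(79, 1))
X = np.hstack([
    trait_a + 0.3 * rng.normal(size=(79, 3)),
    trait_b + 0.3 * rng.normal(size=(79, 3)),
])
items = ["A2", "A3", "A9", "B1", "C5", "C9"]  # illustrative labels

# Fit a two-factor model and assign each item to its dominant factor,
# i.e. the factor with the largest absolute loading.
fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
loadings = fa.components_.T                  # shape: (n_items, n_factors)
dominant = np.abs(loadings).argmax(axis=1)
for item, factor in zip(items, dominant):
    print(f"{item} -> Factor {factor + 1}")
```

Because the latent structure is planted, the first three items land in one cluster and the last three in the other, mirroring how the 24 survey questions were grouped above.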
Results
The assessment of the educational technology platform yielded consistently positive evaluations across all 24 dimensions measured (N=79). Descriptive statistics revealed high mean scores ranging from 4.67 to 4.89 on a 5-point Likert scale, indicating strong user satisfaction with both technical and pedagogical aspects of the platform.
Descriptive Analysis
The highest-rated features were "Skills to be acquired are important life skills" (M=4.89, SD=0.32), "Students can work at their own pace" (M=4.87, SD=0.43), and "Teachers/parents can easily adapt activities to user's needs" (M=4.87, SD=0.37), demonstrating particular appreciation for the platform's pedagogical flexibility and relevance. Technical aspects were also well-received, with "Scanning works as required" (M=4.84, SD=0.46) and "Printouts are of good quality" (M=4.78, SD=0.52) receiving strong positive evaluations.
Analysis of response variation revealed important insights into user consensus. The strongest agreement among respondents (indicated by the lowest standard deviations) centered on core pedagogical features: "Skills to be acquired are important life skills" (SD=0.32), "Users are actively involved in the learning process" (SD=0.35), and "Information provided relates to the learning goals" (SD=0.36). Conversely, greater variation in responses (indicating less consensus) was observed for items such as "Website functions without technical glitches" (SD=0.75), "Activities can easily be shared among other users" (SD=0.67), and "Activities can be used for group work as well as individual work" (SD=0.63).
Correlation Analysis
Correlation analysis identified 11 strong relationships (r > 0.70) between different evaluation dimensions. The strongest correlations were observed between "Interface is user friendly" and "Popup and warning messages are helpful" (r=0.78), "Users easily run the applications" and "Printouts are of good quality" (r=0.76), and "Activities can easily be shared among other users" and "Users easily run the applications" (r=0.75). These correlations suggest that user perception of interface quality is closely linked to the helpfulness of system feedback, and that ease of operation correlates strongly with output quality.
Notable cross-category correlations included "Quality of graphics" with "Skills to be acquired are important life skills" (r=0.71) and "Website functions without technical glitches" with "Users are actively involved in the learning process" (r=0.71), indicating that technical performance aspects positively influence perceptions of pedagogical value.
Factor Analysis
Exploratory Factor Analysis revealed four distinct clusters of platform characteristics:
Factor 1 (User Experience and Learning Engagement): The largest cluster, comprising 15 items including interface usability, technical reliability, learning relevance, and user autonomy. This dominant factor suggests a strong interconnection between technical functionality and pedagogical effectiveness, where ease of use directly supports learning engagement.
Factor 2 (Technical Performance and Accessibility): Consisting of 7 items related to visual quality, cross-browser compatibility, system feedback, and instructional resources. This cluster indicates that output quality and structured guidance form a coherent aspect of the platform experience.
Factor 3 (Assessment Capability): Containing the single item "Teachers can easily assess students' progress," this factor emerged as a distinct consideration separate from other pedagogical features, highlighting the unique importance of assessment functionality.
Factor 4 (Parameter Configuration): Also containing a single item, "Setting the parameters is an easy process," this factor's isolation suggests that parameter configuration represents a specialized aspect of the platform experience that warrants dedicated attention.
In summary, the results demonstrate high overall satisfaction with the educational technology platform, with particularly strong consensus regarding its pedagogical value. The factor analysis reveals that while user experience and learning engagement form a cohesive core experience, specific functionalities like assessment and parameter configuration represent distinct aspects that merit specialized consideration in platform development and implementation.