Fleiss Kappa Calculator & Visualisation of Video Annotations

Version 1.1.0 (41.3 KB) by Jenil Shah
This tool creates a visualization of the annotated video matrix and calculates the Fleiss' kappa value, confidence interval, and significance test.
58 downloads
Updated 11 Jul 2020

Fleiss' kappa (named after Joseph L. Fleiss) is a statistical measure for assessing the reliability of agreement between a fixed number of raters when assigning categorical ratings to a number of items. It works for any number of raters giving categorical ratings to a fixed number of items, and it can be interpreted as expressing the extent to which the observed agreement among raters exceeds what would be expected if all raters made their ratings completely at random.
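
In the standard formulation, the statistic is

\[ \kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e} \]

where \(\bar{P}\) is the mean observed per-frame agreement across raters and \(\bar{P}_e\) is the agreement expected if every rater assigned categories at random according to the overall category proportions.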

This tool was created by Jenil Shah for use in DEVIATE Research @UMTRI to calculate inter-rater reliability for annotated videos, since no existing MATLAB implementation was available. The tool does the following:

1.) Creates a visualization of the labeled matrix to give a clearer picture of the labels.
2.) Converts a labeled video matrix into a Fleiss matrix.
3.) Calculates the overall Fleiss' kappa score and the percent overall agreement among raters above chance.

Usage & Examples

The tool expects the annotations as an n×m matrix, where n = number of labelers and m = number of frames. The (i,j) entry is the category {1, 2, 3, 4, ...} that the i-th labeler assigned to the j-th frame.
Note: Do not use 0 as a category.

Example input:

The following is an annotation matrix for 3 labelers annotating 9 frames for Head Position {3: Midline, 4: Weak Up, 5: Strong Up}.
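
For illustration, such a matrix might look like the following (these values are hypothetical and are not the figures from the original example):

AnnotationMatrix = [3 3 4 4 5 5 3 4 5;    % labeler 1
                    3 3 4 5 5 5 3 4 4;    % labeler 2
                    3 4 4 4 5 5 3 3 5];   % labeler 3
% rows = labelers, columns = frames; entries are category codes
% (3 = Midline, 4 = Weak Up, 5 = Strong Up)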
Usage:
To create the visualization and calculate the Fleiss' kappa value:

fleiss(numLabelers, AnnotationMatrix, significanceLevel);
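
For example, using the illustrative matrix above and a 5% significance level (assuming the level is passed as a fraction, e.g. 0.05):

fleiss(3, AnnotationMatrix, 0.05);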
To only create the Fleiss kappa matrix (see the sketch below for what this conversion amounts to):

Create_Fleiss_Matrix(numLabelers, AnnotationMatrix);
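
A minimal sketch of the conversion, assuming categories are consecutive positive integers 1..k; the function name and internals below are illustrative, not the shipped implementation:

function F = annotation_to_fleiss(A)
    % A: nLabelers-by-nFrames matrix of integer category codes (no zeros)
    k = max(A(:));                 % highest category code in use
    nFrames = size(A, 2);
    F = zeros(nFrames, k);         % rows = frames, columns = categories
    for j = 1:nFrames
        % count how many labelers assigned each category to frame j
        F(j, :) = histcounts(A(:, j), 0.5:1:(k + 0.5));
    end
end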
To only calculate the Fleiss' kappa score from the Fleiss kappa matrix:

fleiss_score(FleissKappaMatrix, significanceLevel);
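
The score itself follows the standard Fleiss' kappa formula. A minimal sketch of that computation from a Fleiss matrix F (rows = frames, columns = categories, entries = number of raters), not necessarily the exact code inside fleiss_score:

n      = sum(F(1, :));                        % raters per frame (assumed constant)
p      = sum(F, 1) / (size(F, 1) * n);        % overall category proportions
P_i    = (sum(F.^2, 2) - n) / (n * (n - 1));  % per-frame observed agreement
P_bar  = mean(P_i);                           % mean observed agreement
Pe_bar = sum(p.^2);                           % agreement expected by chance
kappa  = (P_bar - Pe_bar) / (1 - Pe_bar);     % Fleiss' kappa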
Outputs:

Visualization of Frame labels into categories by multiple Labellers:
Fleiss Kappa Outputs:

Percent Overall Agreement: 0.7613
Overall Fleiss Kappa Score: 0.6045
Substantial agreement per Landis & Koch (1977)

Fleiss_Kappa    Std_Error    Confidence_Interval     z      p_value
____________    _________    ___________________    ____    _______
   0.6045       0.042272     [0.58252, 0.62648]     14.3       0

Reject null hypothesis: Observed agreement is not accidental

Cite As

Jenil Shah (2024). Fleiss Kappa Calculator & Visualisation of Video Annotations (https://github.com/jenilshah990/FleissKappaCalculator-VisulationOfVideoAnnotation/releases/tag/1.1.0), GitHub. Retrieved .

MATLAB Release Compatibility
Created with R2020a
Compatible with any release
Platform Compatibility
Windows macOS Linux

Version    Published    Release Notes
1.1.0

See release notes for this release on GitHub: https://github.com/jenilshah990/FleissKappaCalculator/releases/tag/1.1.0

1.0.0

To view or report issues with this GitHub add-on, visit the GitHub repository.