---
license: mit
task_categories:
- question-answering
language:
- en
- uk
tags:
- Screen Representation
- MacOS
- UI
pretty_name: UI Parsing and Accessibility Dataset (UiPad)
size_categories:
- 1K<n<10K
---

# UiPad - UI Parsing and Accessibility Dataset

- Curated by: [MacPaw Inc.](https://huggingface.co/MacPaw)
- Language(s): Mostly English (en), with some Ukrainian (uk)
- License: MIT

**Overview**
UiPad is a dataset created for the [IASA Champ 2024 Challenge](http://champ.iasa.kpi.ua/), focusing on accessibility and interface understanding for macOS applications. With growing interest in AI-driven user interface analysis, the dataset aims to bridge the gap in available resources for desktop app accessibility. While mobile apps and web platforms benefit from datasets like RICO and Mind2Web, macOS apps remain largely underexplored, particularly regarding accessibility parsing and textual representation.

![image.png](https://cdn-uploads.huggingface.co/production/uploads/660c2657b481e58759ac95ad/3g5kKC1pzqRF7Yi7xnRSk.png)


## Dataset Structure
UiPad contains 352 unique screens from 63 different macOS applications. Of these, 68% include accessibility data in the form of JSON trees. Each app screen is accompanied by a screenshot and, where available, a JSON file detailing the accessibility elements.

```plaintext
dataset/
β”œβ”€β”€ application/
β”‚   β”œβ”€β”€ screen_state_id (no accessibility)/
β”‚   β”‚   └── screenshot.png
β”‚   β”œβ”€β”€ screen_state_id (with accessibility)/
β”‚   β”‚   β”œβ”€β”€ screenshot.png
β”‚   β”‚   └── accessibility_tree.json
β”‚   ...
```
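
As a rough illustration, the sketch below walks this layout in Python and pairs each screenshot with its accessibility tree when one exists. The `dataset` root path and the `iter_screens` helper name are assumptions for the sketch, not part of the dataset.

```python
import json
from pathlib import Path

def iter_screens(root="dataset"):
    """Yield (app, screen_id, screenshot_path, tree_or_None) for every screen.

    File names follow the layout shown above; `root` is an assumption and may
    differ in your local copy. Only ~68% of screens include a JSON tree.
    """
    for app_dir in sorted(Path(root).iterdir()):
        if not app_dir.is_dir():
            continue
        for screen_dir in sorted(app_dir.iterdir()):
            if not screen_dir.is_dir():
                continue
            tree_path = screen_dir / "accessibility_tree.json"
            tree = json.loads(tree_path.read_text()) if tree_path.exists() else None
            yield app_dir.name, screen_dir.name, screen_dir / "screenshot.png", tree
```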

**Screenshot**
A PNG image of the app screen.

**Accessibility Tree Data**
The accessibility tree captures essential attributes of each UI element (see the traversal sketch after this list):
- `name`: Element name
- `role`: The role of the UI element (e.g., button, image)
- `description` and `role_description`: Textual descriptions of the element and its role
- `value`: Element state or value
- `children`: Nested UI components
- `bbox` and `visible_bbox`: Bounding box coordinates of the element
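
A minimal traversal sketch, assuming each node in `accessibility_tree.json` is a JSON object carrying the fields above (any of them may be missing on a given node):

```python
from collections import Counter

def count_roles(node, counter=None):
    """Recursively count `role` values across one accessibility tree.

    Assumes a node is a dict with the fields listed above; missing fields
    are tolerated rather than raising.
    """
    if counter is None:
        counter = Counter()
    counter[node.get("role", "unknown")] += 1
    for child in node.get("children", []) or []:
        count_roles(child, counter)
    return counter

# e.g. roles = count_roles(tree)  # `tree` loaded from accessibility_tree.json
```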


![image.png](https://cdn-uploads.huggingface.co/production/uploads/660c2657b481e58759ac95ad/GZV0ClEBKaqQ2RhZ8J2lA.png)

**Questions (for evaluation)**

The dataset includes several types of questions to evaluate UI understanding (an illustrative scoring sketch follows the list):
- Numeric: "How many checkboxes are checked on the screen?" (485 instances)
- Yes/No: "Is there a '+' button on the screen?" (306 instances)
- String: "What is the name of the app on the screen?" (143 instances)
- Coordinate: "Where do I click to connect Gmail?" (122 instances)
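
The card does not prescribe an exact evaluation metric, so the sketch below is only one plausible convention: exact match for numeric, yes/no, and string answers, and a hit inside the target bounding box for coordinate answers. The `score_answer` helper and the `(x, y, width, height)` bbox layout are assumptions.

```python
def score_answer(q_type, predicted, expected):
    """Illustrative scoring for the four question types (not the official metric)."""
    if q_type in ("numeric", "yes/no", "string"):
        return str(predicted).strip().lower() == str(expected).strip().lower()
    if q_type == "coordinate":
        (x, y), (bx, by, bw, bh) = predicted, expected  # expected = target bbox
        return bx <= x <= bx + bw and by <= y <= by + bh
    raise ValueError(f"unknown question type: {q_type}")
```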

The dataset reflects real-world challenges in accessibility recognition. Some screens lack full accessibility support, with common issues such as misidentified roles (e.g., a button exposed as an image), inaccurate bounding boxes, or missing selected states.
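
A hedged sketch of the kind of sanity check one might run to flag inaccurate boxes, assuming bboxes are stored as `[x, y, width, height]` in screen pixels (the actual JSON layout may differ):

```python
def bbox_is_plausible(elem, screen_w, screen_h):
    """Flag degenerate or off-screen boxes; format assumptions noted above."""
    x, y, w, h = elem.get("bbox", [0, 0, 0, 0])
    if w <= 0 or h <= 0:
        return False                                   # degenerate box
    if x < 0 or y < 0 or x + w > screen_w or y + h > screen_h:
        return False                                   # extends off-screen
    vis = elem.get("visible_bbox")
    if vis:                                            # visible part should sit inside bbox
        vx, vy, vw, vh = vis
        if vx < x or vy < y or vx + vw > x + w or vy + vh > y + h:
            return False
    return True
```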


![image.png](https://cdn-uploads.huggingface.co/production/uploads/660c2657b481e58759ac95ad/dcnmJ8C0FpawfIE2Mu9RX.png)

**Task and Objectives**

UiPad's primary goal is to support the creation of an AI agent that understands and enhances UI accessibility in macOS scenarios. The quality of the generated UI representation and the effectiveness of the AI agent are measured using question-answering tasks related to UI understanding.

**Limitations and Challenges**

- Accessibility data may be incomplete, redundant, or missing.
- The dataset size is limited and may not be sufficient for training models from scratch.
- Human labelling of the Q/A pairs introduces the potential for errors.

**Dataset Card Contact**

Feel free to reach out to tech-research@macpaw.com if you have any questions or need further information about the dataset!