# Object Hunting
The **Object Hunting Game** is an interactive scavenger hunt that uses real-time object detection. Players must locate specific physical objects in their environment using a USB camera connected to the Arduino UNO Q to win the game.

**Note:** This example must be run in **Network Mode** or **Single-Board Computer (SBC) Mode**, since it requires a **USB-C® hub** and a **USB webcam**.
## Description
This App creates an interactive game that recognizes real-world objects. It utilizes the `video_objectdetection` Brick to stream video from a USB webcam and perform continuous inference using the **YoloX Nano** model. The web interface challenges the user to find five specific items: **Book, Bottle, Chair, Cup, and Cell Phone**.

Key features include:
### Software
- Arduino App Lab

**Important:** A **USB-C® hub is mandatory** for this example to connect the USB Webcam.

**Note:** You must connect the USB camera **before** running the App. If the camera is not connected or not detected, the App will fail to start.
## How to Use the Example
1. **Hardware Setup**

   Connect your **USB Webcam** to a powered **USB-C® hub** attached to the UNO Q. Ensure the hub is powered to support the camera.

   *Note: If the App stops immediately after clicking Run, check your USB camera connection.*

3. **Access the Web Interface**

   Open the App in your browser at `<UNO-Q-IP-ADDRESS>:7000`. The interface will load, showing the game introduction and the video feed placeholder.
## How it Works
The application relies on a continuous data pipeline between the hardware, the inference engine, and the web browser.

**High-level data flow:**

```
USB Camera ──► VideoObjectDetection ──► Inference Model (YoloX)
                         │                     │
                         │ (MJPEG Stream)      │ (Detection Events)
                         ▼                     ▼
                Frontend (Browser) ◄───── WebUI Brick
                         │
                         └──► WebSocket (Threshold Control)
```
- **Video Streaming**: The `video_objectdetection` Brick captures video from the USB camera and hosts a low-latency stream on port `4912`. The frontend embeds this stream via an `<iframe>`.
- **Inference**: The backend continuously runs the **YoloX Nano** object detection model on the video frames.
- **Event Handling**: When objects are detected, the backend sends the labels to the frontend via WebSockets.
- **Game Logic**: The frontend JavaScript compares the received labels against the target list and updates the game state, as sketched below.
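The matching step is small enough to sketch in plain JavaScript. This is a simplified illustration rather than the exact contents of `assets/app.js`: the five default target labels come from this example's description, and the helpers `updateChecklist` and `showWinScreen` are hypothetical stand-ins for the real UI code.

```js
// Simplified sketch of the frontend matching step (not the exact app.js code).
const targetObjects = ['book', 'bottle', 'chair', 'cup', 'cell phone'];
const foundObjects = new Set();

// Called with each label the backend reports over the WebSocket.
function handleDetection(label) {
  const name = label.toLowerCase();
  if (targetObjects.includes(name) && !foundObjects.has(name)) {
    foundObjects.add(name);
    updateChecklist(name); // hypothetical UI helper
    if (foundObjects.size === targetObjects.length) {
      showWinScreen(); // hypothetical UI helper
    }
  }
}
```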
## Understanding the Code
### 🔧 Backend (`main.py`)
The Python script initializes the detection engine and bridges the communication between the computer vision model and the web UI.

- **Initialization**: Sets up the WebUI and the Video Object Detection engine.
- **Threshold Control**: Listens for `override_th` messages from the UI to adjust how strict the model is when identifying objects (see the sketch after this list).
- **Processing Detections**: When the backend sends a `detection` event, the script checks whether the detected object is in the target list and hasn't been found yet. If it matches, it updates the UI and checks for a win condition.
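As a rough illustration of the threshold control, the snippet below shows how a slider could send an `override_th` message from the browser. Only the message name comes from the description above; the WebSocket URL, the payload shape, and the `#confidence-slider` element are assumptions, and the actual App may use a helper provided by the WebUI Brick instead of a raw WebSocket.

```js
// Sketch only: assumes a plain browser WebSocket and a slider with this id.
// The real frontend may rely on a helper exposed by the WebUI Brick instead.
const socket = new WebSocket(`ws://${window.location.host}/ws`);
const slider = document.querySelector('#confidence-slider');

slider.addEventListener('input', () => {
  // `override_th` is the message the backend listens for; the payload shape is assumed.
  socket.send(JSON.stringify({ type: 'override_th', value: Number(slider.value) }));
});
```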
The default model used by the `video_objectdetection` Brick is **YoloX Nano**, trained on the **COCO dataset**. This means the camera can detect approximately 80 different types of objects, not just the five used in this example.

**To change the objects you want to hunt:**
147
+
148
+
1. **Choose new targets**: You can select any object from the [standard COCO dataset list](https://github.com/amikelive/coco-labels/blob/master/coco-labels-2014_2017.txt) (e.g., `person`, `keyboard`, `mouse`, `backpack`, `banana`).
2. **Update the code**: Open `assets/app.js` and locate the `targetObjects` array:
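   A sketch of what that array may look like (check the file for the exact definition; the five default targets below are taken from this example's description):

   ```js
   // Default targets used by this example (verify against assets/app.js):
   const targetObjects = ['book', 'bottle', 'chair', 'cup', 'cell phone'];

   // Example replacement using other COCO labels:
   // const targetObjects = ['person', 'keyboard', 'mouse', 'backpack', 'banana'];
   ```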
4. (Optional) Update `assets/index.html` to change the icons and text displayed in the game introduction to match your new targets.
## Troubleshooting
### App fails to start or stops immediately
If the application crashes right after launching, it is likely because the **USB Camera** is not detected.

**Fix:**
1. Ensure the camera is connected to a **powered USB-C hub**.
2. Verify the hub has its external power supply connected (5 V, 3 A).
3. Reconnect the camera and try running the App again.
### Video stream is black or not loading
If the game interface loads but the video area remains black or shows "Searching Webcam...":
- **Browser Security:** Some browsers block mixed content or insecure frames. Ensure you are not blocking the iframe loading from port `4912`.
- **Network:** Ensure your computer and the UNO Q are on the same network.
- **Camera Status:** If the camera was disconnected while the App was running, you must restart the App.
### Objects are not being detected
If you are pointing the camera at an object but it doesn't register:
- **Check the list:** Ensure the object is one of the targets defined in `app.js`.
- **Adjust Confidence:** Lower the **Confidence Level** slider. If set too high (e.g., > 0.80), the model requires a near-perfect view of the object to trigger a detection.
- **Lighting:** Ensure the object is well-lit. Shadows or darkness can prevent detection.
- **Distance:** Move the camera closer or further away. The object should occupy a significant portion of the frame.
### "Connection to the board lost" error
If this message appears at the bottom of the screen:
- The WebSocket connection has been severed.
- **Fix:** Refresh the web page. If the error persists, check that the UNO Q is still powered on and connected to the network.