Commit 396b317 (parent 830a815): Update object hunting docs

4 files changed: +69, -42 lines

examples/object-hunting/README.md

Lines changed: 69 additions & 42 deletions
@@ -1,12 +1,14 @@
# Object Hunting

The **Object Hunting Game** is an interactive scavenger hunt that uses real-time object detection. Players must locate specific physical objects in their environment using a USB camera connected to the Arduino UNO Q to win the game.

**Note:** This example must be run in **Network Mode** or **Single-Board Computer (SBC) Mode**, since it requires a **USB-C® hub** and a **USB webcam**.

![Object Hunting Game Example](assets/docs_assets/thumbnail.png)

## Description

This App creates an interactive game that recognizes real-world objects. It utilizes the `video_objectdetection` Brick to stream video from a USB webcam and perform continuous inference using the **YoloX Nano** model. The web interface challenges the user to find five specific items: **Book, Bottle, Chair, Cup, and Cell Phone**.

Key features include:

@@ -34,19 +36,19 @@ The object hunting game example uses the following Bricks:
### Software

- Arduino App Lab

**Important:** A **USB-C® hub is mandatory** for this example to connect the USB Webcam.

**Note:** You must connect the USB camera **before** running the App. If the camera is not connected or not detected, the App will fail to start.

## How to Use the Example

1. **Hardware Setup**
   Connect your **USB Webcam** to a powered **USB-C® hub** attached to the UNO Q. Ensure the hub is powered to support the camera.
   ![Hardware setup](assets/docs_assets/hardware-setup.png)

2. **Run the App**
   Launch the App from Arduino App Lab.
   *Note: If the App stops immediately after clicking Run, check your USB camera connection.*
   ![Arduino App Lab - Run App](assets/docs_assets/launch-app.png)

3. **Access the Web Interface**
   Open the App in your browser at `<UNO-Q-IP-ADDRESS>:7000`. The interface will load, showing the game introduction and the video feed placeholder.

@@ -67,18 +69,30 @@ The object hunting game example uses the following Bricks:
## How it Works

The application relies on a continuous data pipeline between the hardware, the inference engine, and the web browser.

**High-level data flow:**

```
USB Camera ──► VideoObjectDetection ──► Inference Model (YoloX)
     │                                      │
     │ (MJPEG Stream)                       │ (Detection Events)
     ▼                                      ▼
Frontend (Browser) ◄─────────────────── WebUI Brick
     │
     └──► WebSocket (Threshold Control)
```

- **Video Streaming**: The `video_objectdetection` Brick captures video from the USB camera and hosts a low-latency stream on port `4912`. The frontend embeds this stream via an `<iframe>` (see the sketch after this list).
- **Inference**: The backend continuously runs the **YoloX Nano** object detection model on the video frames.
- **Event Handling**: When objects are detected, the backend sends the labels to the frontend via WebSockets.
- **Game Logic**: The frontend JavaScript compares the received labels against the target list and updates the game state.
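
The sketch below illustrates the streaming step: the frontend builds the stream URL from the page's own hostname and points an `<iframe>` at it. The port (`4912`) and `/embed` path come from the App's frontend; the `video-frame` element id is an assumption used only for this sketch (the real markup lives in `assets/index.html`).

```javascript
// Build the stream URL from the current hostname so it works on any network.
// Port 4912 is where the video_objectdetection Brick serves the embeddable stream.
const currentHostname = window.location.hostname;
const targetPort = 4912;
const streamUrl = `http://${currentHostname}:${targetPort}/embed`;

// 'video-frame' is a hypothetical <iframe> id used for illustration.
document.getElementById('video-frame').src = streamUrl;
```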

## Understanding the Code

### 🔧 Backend (`main.py`)

The Python script initializes the detection engine and bridges the communication between the computer vision model and the web UI.

- **Initialization**: Sets up the WebUI and the Video Object Detection engine.
- **Threshold Control**: Listens for `override_th` messages from the UI to adjust how strict the model is when identifying objects (a frontend-side sketch of this message follows the code excerpt below).

@@ -105,53 +119,66 @@ def send_detections_to_ui(detections: dict):
detection_stream.on_detect_all(send_detections_to_ui)
```
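
On the browser side, threshold control only requires sending an `override_th` message whenever the user moves the confidence slider. The sketch below uses a plain WebSocket for illustration; the actual App communicates through the WebUI Brick rather than a raw socket, so the endpoint path, slider id, and payload shape shown here are assumptions.

```javascript
// Hypothetical sketch: endpoint, element id, and message format are assumptions.
const socket = new WebSocket(`ws://${window.location.host}/ws`);
const slider = document.getElementById('confidence-slider');

slider.addEventListener('input', () => {
    // Ask the backend to override the detection confidence threshold (0.0 - 1.0).
    socket.send(JSON.stringify({ type: 'override_th', value: Number(slider.value) }));
});
```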

### 🔧 Frontend (`app.js`)

The web interface handles the game logic. It defines the specific objects required to win the game.

```javascript
const targetObjects = ['book', 'bottle', 'chair', 'cup', 'cell phone'];
let foundObjects = [];

function handleDetection(detection) {
    const detectedObject = detection.content.toLowerCase();

    // Check if the detected item is a target and not yet found
    if (targetObjects.includes(detectedObject) && !foundObjects.includes(detectedObject)) {
        foundObjects.push(detectedObject);
        updateFoundCounter();
        checkWinCondition();
    }
}
```
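
`handleDetection()` also calls `updateFoundCounter()`, which is not shown in this excerpt. A minimal sketch of what that helper can look like is given below; the `found-counter` element id is an assumption, and the actual implementation lives in `assets/app.js`.

```javascript
// Hypothetical helper: refreshes a "found X / Y" counter in the page.
// 'found-counter' is an assumed element id.
function updateFoundCounter() {
    const counter = document.getElementById('found-counter');
    counter.textContent = `${foundObjects.length} / ${targetObjects.length}`;
}
```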

When all targets have been found, `checkWinCondition()` ends the game and switches to the win screen:

```javascript
function checkWinCondition() {
    if (foundObjects.length === targetObjects.length) {
        gameStarted = false;
        // Hide video, show win screen
        videoFeedContainer.classList.add('hidden');
        winScreen.classList.remove('hidden');
    }
}
```

### 🛠️ Customizing the Game

The default model used by the `video_objectdetection` Brick is **YoloX Nano**, trained on the **COCO dataset**. This means the camera can detect approximately 80 different types of objects, not just the five used in this example.

**To change the objects you want to hunt:**

1. **Choose new targets**: You can select any object from the [standard COCO dataset list](https://github.com/amikelive/coco-labels/blob/master/coco-labels-2014_2017.txt) (e.g., `person`, `keyboard`, `mouse`, `backpack`, `banana`).
2. **Update the code**: Open `assets/app.js` and locate the `targetObjects` array:
   ```javascript
   const targetObjects = ['book', 'bottle', 'chair', 'cup', 'cell phone'];
   ```
3. **Replace the items**: Substitute the strings with your chosen object names from the COCO list.
   ```javascript
   const targetObjects = ['person', 'keyboard', 'mouse', 'laptop', 'backpack'];
   ```
4. (Optional) Update `assets/index.html` to change the icons and text displayed in the game introduction to match your new targets (see the sketch after this list for one way to keep the page in sync automatically).
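
For step 4, one optional way to keep the page in sync without editing `assets/index.html` by hand is to generate the checklist from the `targetObjects` array at page load. This is only a sketch under assumed markup (a container element with id `target-list`); adapt it to the actual structure of the App's page.

```javascript
// Hypothetical helper: builds one checklist entry per target so the page
// always matches the targetObjects array. 'target-list' is an assumed id.
function renderTargetList() {
    const list = document.getElementById('target-list');
    list.innerHTML = '';
    for (const name of targetObjects) {
        const item = document.createElement('li');
        item.textContent = name;
        list.appendChild(item);
    }
}
```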

## Troubleshooting

### App fails to start or stops immediately

If the application crashes right after launching, it is likely because the **USB camera** is not detected.

**Fix:**

1. Ensure the camera is connected to a **powered USB-C hub**.
2. Verify the hub has its external power supply connected (5 V, 3 A).
3. Reconnect the camera and try running the App again.

### Video stream is black or not loading

If the game interface loads but the video area remains black or shows "Searching Webcam...":

- **Browser Security:** Some browsers block mixed content or insecure frames. Ensure you are not blocking the iframe loading from port `4912`.
- **Network:** Ensure your computer and the UNO Q are on the same network.
- **Camera Status:** If the camera was disconnected while the App was running, you must restart the App.

### Objects are not being detected

If you are pointing the camera at an object but it doesn't register:

- **Check the list:** Ensure the object is one of the targets defined in `app.js`.
- **Adjust Confidence:** Lower the **Confidence Level** slider. If set too high (e.g., > 0.80), the model requires a perfect angle to trigger a detection.
- **Lighting:** Ensure the object is well-lit. Shadows or darkness can prevent detection.
- **Distance:** Move the camera closer or further away. The object should occupy a significant portion of the frame.

### "Connection to the board lost" error

If this message appears at the bottom of the screen:

- The WebSocket connection has been severed.
- **Fix:** Refresh the web page. If the error persists, check if the UNO Q is still powered on and connected to the network.