【Introduction】: GitHub Homepage: https://github.com/victordibi...

1. Introduction

handtrack.js is a JavaScript library, built on top of TensorFlow.js, that lets you prototype real-time hand detection directly in the browser: given an image, canvas, or video element, it returns the bounding boxes of the hands it finds, with no server round-trip and no hardware beyond an ordinary webcam.

2. Application scenarios

If you are interested in gesture-based interactive experiences, handtrack.js makes them easy to prototype. Some relevant application scenarios: mapping hand motion to cursor or game control, treating the overlap of a hand with an on-screen object as a touch or selection signal, and interactive art, games, or accessibility interfaces driven by gestures.
3. Usage

You can include the library URL directly in the page with a script tag, or install it as an npm package.

3.1 Using the script tag

The minified js file of handtrack.js is available from the jsDelivr CDN:

<script src="https://cdn.jsdelivr.net/npm/handtrackjs/dist/handtrack.min.js"> </script>
<img id="img" src="hand.jpg"/>
<canvas id="canvas" class="border"></canvas>

Once you have added the above tags to the page, you can load the model and run detection:

<script>
  const img = document.getElementById('img');
  const canvas = document.getElementById('canvas');
  const context = canvas.getContext('2d');

  // Load the model.
  handTrack.load().then(model => {
    model.detect(img).then(predictions => {
      console.log('Predictions: ', predictions);
    });
  });
</script>

The above code snippet will print out the predicted bounding boxes for the image passed in via the img tag.

3.2 Using NPM

You can install handtrack.js as an npm package using the following command:

npm install --save handtrackjs

You can then import and use it in your web application:

import * as handTrack from 'handtrackjs';

const img = document.getElementById('img');

// Load the model.
handTrack.load().then(model => {
  console.log("model loaded");
  // Detect objects in the image.
  model.detect(img).then(predictions => {
    console.log('Predictions: ', predictions);
  });
});

3.3 Handtrack.js API

handTrack.load() accepts an optional set of model parameters:

const modelParams = {
  flipHorizontal: true,   // flip the input horizontally, e.g. for webcam video
  imageScaleFactor: 0.7,  // reduce input image size for gains in speed
  maxNumBoxes: 20,        // maximum number of boxes to detect
  iouThreshold: 0.5,      // IoU threshold for non-max suppression
  scoreThreshold: 0.79,   // confidence threshold for predictions
}

handTrack.load(modelParams).then(model => {
  // model is ready to use
});

model.detect() accepts an input source (an img, video, or canvas element) and returns an array of bounding boxes with class names and confidence scores:

model.detect(img).then(predictions => {
  // predictions is an array of detected hands
});

The prediction result format is as follows:

[{
  bbox: [x, y, width, height],
  class: "hand",
  score: 0.8380282521247864
}, {
  bbox: [x, y, width, height],
  class: "hand",
  score: 0.74644153267145157
}]

Handtrack.js also provides other methods, including helpers for starting and stopping webcam video and for drawing predictions onto a canvas; a usage sketch is shown below.
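For live interaction you typically run detection on a webcam stream rather than a static image. The following is a minimal sketch of one way to do that, assuming the handTrack.startVideo and model.renderPredictions helpers behave as described in the project README; the element ids myvideo and mycanvas are placeholders for your own markup.

const video = document.getElementById('myvideo');   // placeholder <video> element
const canvas = document.getElementById('mycanvas'); // placeholder <canvas> element
const context = canvas.getContext('2d');

let model = null;

// Load the model, then ask the library to attach the webcam stream to the video element.
handTrack.load().then(loadedModel => {
  model = loadedModel;
  return handTrack.startVideo(video);
}).then(status => {
  if (status) {
    runDetection();
  } else {
    console.log('Please enable video access');
  }
});

// Detect hands in the current video frame, draw the predicted boxes onto
// the canvas, then schedule the next frame.
function runDetection() {
  model.detect(video).then(predictions => {
    model.renderPredictions(predictions, canvas, context, video);
    requestAnimationFrame(runDetection);
  });
}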
4. Next steps

There is still room for improvement:

- Computational cost: predicting bounding boxes requires running a neural network on every frame, which is expensive; this is a point that needs to be improved and optimized later.
- Tracking IDs across frames: assign an ID to each object as it enters a frame and continue tracking it across subsequent frames (see the sketch after this list).
- Add some discrete postures: for example, not just "hand", but distinguish an open hand from a fist.
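To make the "tracking IDs across frames" idea concrete, here is a minimal sketch of one possible approach (it is not part of handtrack.js): match each new detection against the previous frame's tracks by bounding-box overlap (IoU) and reuse the ID when the overlap is high enough. All names here (assignIds, iou, tracks) are illustrative.

// Illustrative IoU-based ID assignment, not part of the handtrack.js API.
let tracks = [];   // [{ id, bbox }] remembered from the previous frame
let nextId = 0;

// Intersection-over-union of two [x, y, width, height] boxes.
function iou(a, b) {
  const x1 = Math.max(a[0], b[0]);
  const y1 = Math.max(a[1], b[1]);
  const x2 = Math.min(a[0] + a[2], b[0] + b[2]);
  const y2 = Math.min(a[1] + a[3], b[1] + b[3]);
  const inter = Math.max(0, x2 - x1) * Math.max(0, y2 - y1);
  const union = a[2] * a[3] + b[2] * b[3] - inter;
  return union > 0 ? inter / union : 0;
}

// Give each prediction an id: reuse the id of the best-overlapping previous
// track, or assign a fresh id for a newly appearing hand.
function assignIds(predictions) {
  const used = new Set();
  const updated = predictions.map(p => {
    let best = null;
    let bestIou = 0.3;   // minimum overlap to count as "the same hand"
    for (const t of tracks) {
      if (used.has(t.id)) continue;
      const overlap = iou(p.bbox, t.bbox);
      if (overlap > bestIou) { best = t; bestIou = overlap; }
    }
    if (best) used.add(best.id);
    return { ...p, id: best ? best.id : nextId++ };
  });
  tracks = updated.map(p => ({ id: p.id, bbox: p.bbox }));
  return updated;
}

// Usage: model.detect(video).then(predictions => console.log(assignIds(predictions)));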
5. References

- Source code of the Handtrack.js library: https://github.com/victordibi...
- Online demo: https://victordibia.github.io...
- Egohands dataset: http://vision.soic.indiana.ed…