# PAJ7620
This feature is not included in precompiled binaries. To use it you must compile your build. Add the following to `user_config_override.h`:

```
#ifndef USE_PAJ7620
#define USE_PAJ7620    // PAJ7620 gesture sensor (I2C address 0x73) (+2.5k code)
#endif
```
The PAJ7620U2 is an integrated gesture recognition sensor from PixArt Imaging Inc. with an I2C interface, based on infrared detection. It also has built-in proximity detection and can sense various properties such as position (x, y, z) and speed. Gesture recognition seems to be more stable than with the APDS-9960, which, on the other hand, is a lot cheaper.
Breakout | ESP8266 |
---|---|
VCC/VIN | +3.3VDC |
GND | GND |
SCL | GPIOy |
SDA | GPIOx |
INT | NC |
In the Configuration -> Configure Module page assign:

- GPIOx to `I2C SDA (6)`
- GPIOy to `I2C SCL (5)`
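Alternatively, the same assignment can be made from the console. The following is only a sketch: it assumes GPIO4 is wired to SDA and GPIO5 to SCL, and that the component IDs 6 (I2C SDA) and 5 (I2C SCL) shown above match your build; newer releases use a different component numbering, so check the output of `Gpios` first.

```
Backlog Gpio4 6; Gpio5 5
```

The device restarts after the assignment; `Gpio` then shows the current mapping.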
After a reboot the driver will detect the PAJ7620 automatically.
After a restart Tasmota needs some time to completely configure its state. During this time frame it is likely to miss some gestures. This should stabilize after a few moments.
To use the sensor you need to switch to the desired mode of operation with `Sensor50 <x>`, where `<x>` = 0…5. The sensor will not appear in the webUI, but its readings can be observed via MQTT messages in the console.
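For example, to put the sensor into gesture mode, enter the following in the console (the individual modes are described below):

```
Sensor50 1
```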
`Sensor50 0`: Sensor muted, no readings in Tasmota.
`Sensor50 1`: Gesture mode. Reports gesture movement with:

- Up
- Down
- Left
- Right
- Near
- Far
- CW (clockwise rotation)
- CCW (counter-clockwise rotation)
As expected, the "Near" and "Far" gestures are tricky and you have to train your movements to catch them. Sometimes the sensor reports "Near" and "Far" at once, in which case the reading is discarded. Some postprocessing allows the object (hand or finger) to move into the sensing area: the initial direction report (up, down, left, right) is delayed to give the intended "Near" or "Far" movement a chance to trigger. Especially "Far" is a bit harder to achieve.
Example:

- `…{Up:1}` = up gesture once
- `…{Left:3}` = left gesture three times in a row, without any other gesture in between
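Such gesture reports can be used directly in Tasmota rules. The following is only a sketch: it assumes the values appear under a `PAJ7620` key in the SENSOR JSON (check the console output for the actual name) and that a switchable output is configured as `Power1`:

```
Rule1 ON PAJ7620#Up DO Power1 ON ENDON ON PAJ7620#Down DO Power1 OFF ENDON
Rule1 1
```

The first line defines the rule, the second one enables it.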
`Sensor50 2`: Proximity mode. Values between 0 (far away) and 255 (very near) are reported. Leaving the sensor field will always give at least one "zero message". A `tele` message is only triggered when the value has changed.
Example:

- `…{Proximity:255}` = close proximity, almost touching the sensor
- `…{Proximity:0}` = object has left the sensing area
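As a sketch, the proximity value could drive a dimmer via a rule; again this assumes a `PAJ7620#Proximity` trigger name and a dimmable light configured on the device:

```
Rule2 ON PAJ7620#Proximity>200 DO Dimmer 100 ENDON ON PAJ7620#Proximity=0 DO Dimmer 10 ENDON
Rule2 1
```

Since a message is only sent when the value changes, the rule fires once per change rather than continuously.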
`Sensor50 3`: Corner mode. The sensing area is organised into quarters; an object in one of the corners will trigger the corresponding number.
1 | 2 |
---|---|
3 | 4 |
Example:

- `…{Corner:2}` = object in the upper right corner
`Sensor50 4`: PIN mode. A smooth movement of an object through a given sequence of corners (similar to unlocking a smartphone) will trigger a valid "PIN". The next corner must be reached within about 0.7 seconds.
Example:

- `…{PIN:1}` = valid PIN
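This can serve as a simple gesture lock. A minimal sketch, again assuming a `PAJ7620#PIN` trigger name and a relay on `Power1`:

```
Rule3 ON PAJ7620#PIN=1 DO Power1 TOGGLE ENDON
Rule3 1
```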
`Sensor50 5`: Shows x- and y-coordinates. Mainly intended for debugging and for "seeing" the sensing area. This reads only the upper 5-bit values, which automatically removes much of the jitter, giving values between 0 and 15.
Example:

- `…{x:1, y:15}` = upper left corner
The sensor provides some more goodies, like the velocity of an object, so if someone has a fancy use case for this, feel free to open a feature request. Of course it would be possible to mix the modes, but this would produce a lot of combinations to document. Such options could be added later upon user request (based on real-world use cases).