Snap Inc. has secured a major milestone in drone and transportation technology with an outstanding invention that won Swanson Reed’s Patent of the Month for February 2026: a newly patented technology titled ‘Landing an autonomous drone with gestures’. The patent discloses systems, computer-readable media, and methods for landing an autonomous drone with gestures. Example methods include lifting the autonomous drone off in response to an instruction from a person, receiving sensor data, and processing that data to identify a gesture from the person indicating that the drone should land. The drone recognizes the landing gesture from the person’s physical movement and, in response, navigates to a landing. In some examples, the person presents an open palm and the drone flies to and lands on it; in others, the person places a hand under the drone, causing it to land. In some examples, the autonomous drone responds only to the person who launched it.
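The control flow the patent describes (lift off on command, monitor sensor data, land when a recognized gesture appears) can be sketched as a small state machine. This is a minimal illustration only: the state names, the gesture labels, and the `GestureLandingController` class are hypothetical stand-ins, not drawn from the patent claims, and the gesture label is passed in directly where a real system would run a vision model over camera frames.

```python
from enum import Enum, auto


class DroneState(Enum):
    GROUNDED = auto()
    FLYING = auto()
    LANDING = auto()


# Illustrative labels for the gestures the patent mentions (open palm,
# hand placed under the drone); the names are assumptions for this sketch.
LANDING_GESTURES = {"open_palm", "hand_under_drone"}


class GestureLandingController:
    """Toy control loop: lift off on instruction, land on a recognized gesture."""

    def __init__(self):
        self.state = DroneState.GROUNDED

    def lift_off(self):
        # Corresponds to "lifting off the autonomous drone in response
        # to an instruction from a person".
        if self.state is DroneState.GROUNDED:
            self.state = DroneState.FLYING

    def process_sensor_data(self, gesture):
        # In a real system this step would classify raw sensor frames;
        # here the recognized gesture label is supplied directly.
        if self.state is DroneState.FLYING and gesture in LANDING_GESTURES:
            self.state = DroneState.LANDING  # navigate toward the hand and descend
```

An unrecognized movement (say, a wave) leaves the drone flying; only a landing gesture transitions it into the landing sequence.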
Meeting U.S. R&D Tax Credit Rules
To qualify for the U.S. R&D Tax Credit, a project must pass the IRS’s four-part test. The development of this gesture-landing drone technology exemplifies these rules perfectly:
- Permitted Purpose: Snap Inc. sought to create a new or improved business component by engineering a drone with novel interactive flight control capabilities.
- Technological in Nature: The research heavily relies on the hard sciences, specifically computer science, optical engineering, and physics, to capture and interpret real-time physical data.
- Elimination of Uncertainty: At the project’s outset, there would have been significant technical uncertainty regarding how a drone’s software could reliably distinguish an intentional landing gesture from incidental human movement.
- Process of Experimentation: The engineering team would have undergone iterative testing—evaluating various computer vision algorithms, sensor configurations, and flight control responses—to achieve a reliable and safe landing sequence.
3 Practical Applications Qualifying for R&D Claims
In developing the technology described in this patent, the following practical engineering applications would likely meet the criteria for the R&D tax credit:
- Developing Dynamic Computer Vision Models: Engineering and training machine learning algorithms to accurately recognize specific hand gestures (like an open palm) across highly variable outdoor conditions, such as blinding sunlight, low-light environments, or complex moving backgrounds.
- Proximity and Aerodynamic Integration: Designing and testing the complex software-hardware loop required to translate a recognized visual gesture into an immediate aerodynamic response, ensuring the drone can safely reduce thrust and alter pitch to land smoothly on a human hand without causing injury.
- Launcher-Recognition Protocols: Creating and experimenting with continuous tracking or biometric identification software that ensures the drone only responds to the gestures of the specific individual who launched it, preventing interference from bystanders.
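The launcher-recognition idea above can be illustrated with a simple gating check: landing gestures are accepted only when they come from the tracked identity of the person who launched the drone, above a confidence threshold. The `ObservedGesture` record, the `LauncherOnlyGestureFilter` class, and the 0.8 threshold are all assumptions for this sketch; the person identity would come from a hypothetical tracking or re-identification module, not shown here.

```python
from dataclasses import dataclass


@dataclass
class ObservedGesture:
    person_id: str     # identity from a hypothetical person-tracking module
    gesture: str       # label from a hypothetical gesture classifier
    confidence: float  # classifier confidence in [0, 1]


class LauncherOnlyGestureFilter:
    """Accept landing gestures only from the person who launched the drone."""

    def __init__(self, launcher_id, threshold=0.8):
        self.launcher_id = launcher_id
        self.threshold = threshold

    def should_land(self, obs):
        # Gate on identity, gesture type, and confidence so that a
        # bystander's open palm does not trigger a landing.
        return (
            obs.person_id == self.launcher_id
            and obs.gesture == "open_palm"
            and obs.confidence >= self.threshold
        )
```

With this gate, an open palm from the launcher triggers landing, while the same gesture from a bystander, or a low-confidence detection, is ignored.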