Agricultural Robot ALLY

Technologies: On-board decision making, Optical Navigation, AI, SWARM Intelligence, Obstacle avoidance

 

Challenge: Optimizing the costs of irrigation, spraying, and maintenance of agricultural land.

 

Solution Produced: ALLY is an autonomous ground-based agricultural drone that collects and analyzes data on the state of the soil, irrigates specific areas according to that data, and, when necessary, sprays against pests.
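 
As an illustration of the on-board decision making, here is a minimal sketch that turns per-zone soil readings into irrigation and spraying actions. The thresholds, field names, and zone model are assumptions, not ALLY's production logic:

```python
from dataclasses import dataclass

# Hypothetical thresholds; real values would come from agronomic calibration.
MOISTURE_MIN = 0.25      # volumetric soil moisture below which a zone is irrigated
PEST_SCORE_MAX = 0.6     # pest-detection confidence above which a zone is sprayed

@dataclass
class SoilSample:
    zone_id: int
    moisture: float      # 0.0-1.0, from the on-board soil probe
    pest_score: float    # 0.0-1.0, from the on-board vision model

def plan_actions(samples):
    """Return the per-zone actions the robot should perform on its next pass."""
    plan = []
    for s in samples:
        actions = []
        if s.moisture < MOISTURE_MIN:
            actions.append("irrigate")
        if s.pest_score > PEST_SCORE_MAX:
            actions.append("spray")
        if actions:
            plan.append((s.zone_id, actions))
    return plan

if __name__ == "__main__":
    readings = [
        SoilSample(zone_id=1, moisture=0.18, pest_score=0.20),
        SoilSample(zone_id=2, moisture=0.40, pest_score=0.75),
    ]
    print(plan_actions(readings))   # [(1, ['irrigate']), (2, ['spray'])]
```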

 

Killer Feature: ALLY does not compact the soil, so it does not disturb the plants.

Autopilot and navigation are camera-based.
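 
As an illustration of camera-based row following (a sketch under assumed conditions, not ALLY's actual vision pipeline), the example below segments green vegetation in a frame and returns a normalized steering offset. The OpenCV dependency and the HSV colour bounds are assumptions:

```python
import cv2
import numpy as np

def steering_from_frame(frame_bgr):
    """Estimate a steering correction by locating the crop row in the image.

    Segments green vegetation in HSV space, finds its horizontal centroid,
    and returns the normalized offset from the image centre
    (-1.0 = far left, +1.0 = far right).
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))  # rough "green" band
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return 0.0                       # no vegetation found: hold course
    cx = m["m10"] / m["m00"]
    width = frame_bgr.shape[1]
    return (cx - width / 2) / (width / 2)

if __name__ == "__main__":
    frame = np.zeros((120, 160, 3), dtype=np.uint8)
    frame[:, 110:130] = (0, 255, 0)      # synthetic crop row right of centre
    print(round(steering_from_frame(frame), 2))   # positive -> steer right
```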

ALLY robots are transported in containers. Each container holds everything needed to operate 10 robots for 10 years.

 

Step Sensor

Technologies: sensors, strain gauges, solar battery, STM32, Raspberry Pi, Google Coral, LoRa, LoRa Mesh, and GPS.

 

Challenge: Enhancing the security of critical assets

 

How It Works:

  • Sensors are installed in the ground within the working range of the protected object.
  • When a moving object approaches, a sensor detects the soil vibrations and transmits the object's location (see the detection sketch after this list).
  • The coordinates and direction of the moving object are displayed on a digital map.
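 
A minimal sketch of this detection loop, with the strain-gauge read and the LoRa Mesh transmit simulated in software; the node ID, threshold, coordinates, and message format are assumptions:

```python
import json
import random
import time

# Hypothetical calibration values for a buried strain-gauge node.
STEP_THRESHOLD = 0.015                    # relative amplitude that counts as a footstep
NODE_ID = "step-sensor-07"
NODE_LAT, NODE_LON = 55.7512, 37.6184     # fixed, surveyed position of this sensor

def read_vibration():
    """Stand-in for the strain-gauge/ADC read on the STM32; simulated here."""
    return abs(random.gauss(0.0, 0.01))

def lora_send(payload: bytes):
    """Stand-in for the LoRa Mesh radio driver; here it just prints the frame."""
    print("TX:", payload.decode())

def monitor(duration_s=2.0, sample_hz=20):
    """Sample the ground and broadcast an event whenever a step is detected."""
    end = time.time() + duration_s
    while time.time() < end:
        amplitude = read_vibration()
        if amplitude > STEP_THRESHOLD:
            event = {"node": NODE_ID, "lat": NODE_LAT, "lon": NODE_LON,
                     "amp": round(amplitude, 4), "ts": round(time.time(), 2)}
            lora_send(json.dumps(event).encode())
        time.sleep(1.0 / sample_hz)       # low sampling rate keeps the solar budget small

if __name__ == "__main__":
    monitor()
```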

 

Killer Feature: Human footsteps are detected at distances of up to 300 m and passing vehicles at up to 1 km. Step sensors operate autonomously for up to 3 months, and data-transmission coverage reaches 20 km.

Ultrasonic Weather Station

Technologies: GPS, LoRa Mesh, Bluetooth, magnetic compass, air pressure sensor, air temperature and humidity sensor, CO sensor, CO2 sensor, and ultrasonic transducer.

  

Challenge: Determining environmental parameters during episodes of elevated pollution

 

How It Works: 

  • The weather station is installed in an open area.
  • Environmental parameters are determined using the magnetic compass and the sensors installed on the device (see the wind-calculation sketch after this list).
  • The interactive map displays the weather station's location and real-time data.
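 
The wind measurement behind the ultrasonic transducers can be sketched as follows: each axis compares sound transit times in opposite directions, so the speed of sound (and with it most dust and temperature effects) cancels out of the result. The path length, axis naming, and bearing convention below are assumptions:

```python
import math

PATH_LENGTH = 0.20  # metres between opposing transducers (assumed geometry)

def axis_wind(t_fwd, t_rev, path=PATH_LENGTH):
    """Wind component along one transducer axis from the two transit times.

    v = (L / 2) * (1 / t_fwd - 1 / t_rev); the speed of sound cancels out.
    """
    return (path / 2.0) * (1.0 / t_fwd - 1.0 / t_rev)

def wind_vector(t_n2s, t_s2n, t_e2w, t_w2e):
    """Combine the north-south and east-west axes into speed (m/s) and bearing (deg)."""
    v_ns = axis_wind(t_n2s, t_s2n)
    v_ew = axis_wind(t_e2w, t_w2e)
    speed = math.hypot(v_ns, v_ew)
    bearing = (math.degrees(math.atan2(v_ew, v_ns)) + 360.0) % 360.0
    return speed, bearing

if __name__ == "__main__":
    # Synthetic transit times for a 3 m/s wind along one axis with c ≈ 343 m/s.
    t_with = PATH_LENGTH / (343.0 + 3.0)
    t_against = PATH_LENGTH / (343.0 - 3.0)
    calm = PATH_LENGTH / 343.0
    print(wind_vector(t_with, t_against, calm, calm))   # ≈ (3.0, 0.0)
```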

 

Killer Feature: The device works accurately in highly dusty conditions thanks to its ultrasonic transducer measurements. Any other sensor can easily be connected to the weather station system.

The Turtle

Technologies: Optical SWARM navigation, LoRa MESH, STM32 motor control, Nvidia Jetson Nano, Embedded Machine Vision, Machine Learning, Obstacle avoidance.

 

Challenge: Exploration and navigation in hard-to-reach places without GPS or Internet access

 

How It Works: 

  • The user sends the Lead drone on a mission using gestures or voice commands.
  • The SWARM of drones follows the Lead drone according to the SWARM configuration.
  • The Lead drone communicates with the SWARM elements via the LoRa MESH network.
  • The SWARM has a hive mind, so data is freely shared between its elements (see the map-sharing sketch after this list).
  • The SWARM produces a digital map of the explored area.
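 
A minimal sketch of the hive-mind map sharing, assuming each drone keeps a small occupancy grid and broadcasts newly explored cells over the mesh. The grid size, cell states, and frame format are assumptions; the real LoRa MESH transport is not shown:

```python
import json

GRID = 32  # assumed 32 x 32 local occupancy grid

class SwarmMap:
    def __init__(self):
        # 0 = unknown, 1 = free, 2 = obstacle
        self.cells = [[0] * GRID for _ in range(GRID)]

    def observe(self, x, y, state):
        """Record a local observation and return a mesh frame to broadcast."""
        self.cells[y][x] = state
        return json.dumps({"x": x, "y": y, "state": state}).encode()

    def merge(self, frame: bytes):
        """Fold a frame received from another swarm member into the local map."""
        msg = json.loads(frame)
        if self.cells[msg["y"]][msg["x"]] == 0:   # only fill unknown cells
            self.cells[msg["y"]][msg["x"]] = msg["state"]

if __name__ == "__main__":
    lead, follower = SwarmMap(), SwarmMap()
    frame = lead.observe(5, 7, 2)      # Lead drone sees an obstacle at (5, 7)
    follower.merge(frame)              # follower learns it without revisiting
    print(follower.cells[7][5])        # -> 2
```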

 

Killer Feature: The drones detect and avoid obstacles while building digital maps, applying SWARM hive-mind behavior, indoor navigation technologies, and unique gesture-recognition and voice-control algorithms.
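 
As an illustration of the obstacle-avoidance step (a sketch under assumed inputs, not The Turtle's actual algorithm), the example below picks a motion command from a depth frame; the stop distance and the availability of a metric depth image are assumptions:

```python
import numpy as np

STOP_DISTANCE = 0.6   # metres; anything closer triggers an avoidance turn (assumed)

def avoidance_command(depth_m: np.ndarray) -> str:
    """Return 'forward', 'turn_left' or 'turn_right' from a depth frame in metres."""
    h, w = depth_m.shape
    band = depth_m[h // 3: 2 * h // 3]            # central horizontal band at drone height
    left, right = band[:, : w // 2], band[:, w // 2:]
    if min(left.min(), right.min()) > STOP_DISTANCE:
        return "forward"
    # Turn toward the side with more clearance.
    return "turn_left" if left.min() > right.min() else "turn_right"

if __name__ == "__main__":
    frame = np.full((120, 160), 3.0)
    frame[40:80, 100:150] = 0.4                   # synthetic obstacle on the right
    print(avoidance_command(frame))               # -> 'turn_left'
```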