Arcalabs DroneInSphero

Arcalabs DroneToSpheroEvolution

AR.Drone 2.0 VS SPHERO


You may know the AR.Drone, built by Parrot. You may also know Sphero, the robotic ball invented by Orbotix. We've decided to combine their functionalities to provide a new drone control experience, all of this, of course, using LabVIEW!

 

Parrot's AR.Drone:

This drone is usually controlled using a smartphone running Android or iOS. Nevertheless, Parrot's engineers have published documentation about drone control on their website.

 

Sphero:

Sphero is usually controlled using a smartphone running Android or iOS. Orbotix's engineers have also opened Sphero's firmware code and give many details about it on their website.

 

The challenge:

Many projects have been built using these two devices, but this one combines the power of the AR.Drone, Sphero and NI LabVIEW all together!

The main goal of our project is to control the AR.Drone using Sphero. Sphero won't be used as a moving ball; it will be used as the drone's remote control!

LabVIEW will be used to make these two objects communicate and to display all the data necessary to control the AR.Drone!

 

First tests:

Sphero:

No LabVIEW driver for controlling the Sphero existed when we started the project, so we built our own, based on Orbotix's documentation of the Bluetooth frames exchanged between the Sphero and a controller.

We found out that Sphero communicates both synchronously and asynchronously. Each synchronous request sent to Sphero gets a direct answer from the device (a 'handshaking' system). Asynchronous frames are only sent by Sphero when necessary: you 'subscribe' to a specific item (for example, 'collision data') and Sphero will send a 'collision frame' only when a collision is detected.

 

We developed a communication layer able to handle both synchronous and asynchronous frames sent on the Bluetooth 'wire', along with functions to decode these frames.
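For reference, here is a minimal Python sketch of the frame layout described in Orbotix's documentation (our actual driver implements the same logic in LabVIEW; the DID/CID values used in the example are purely illustrative):

```python
def build_command(did, cid, seq, data=b""):
    """Build a synchronous Sphero command frame per the Orbotix API layout:
    SOP1 SOP2 DID CID SEQ DLEN <data...> CHK, where DLEN = len(data) + 1
    and CHK is the one's complement of the modulo-256 sum of DID..data."""
    body = bytes([did, cid, seq, len(data) + 1]) + data
    chk = (~sum(body)) & 0xFF
    return b"\xff\xff" + body + bytes([chk])

def is_async(frame):
    """Asynchronous frames are flagged by SOP2 = 0xFE instead of 0xFF,
    which is how the communication layer routes them."""
    return frame[1] == 0xFE
```

For example, `build_command(0x02, 0x30, 0x01)` (illustrative DID/CID bytes, empty payload) produces the 7-byte frame `ff ff 02 30 01 01 cb`.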

 

The next challenge was to use the Sphero not for its motion skills but for the data coming from its internal sensors (accelerometers, gyroscopes, etc.) and for its capability to continuously stream their values. From our point of view, the most important data come from its IMU (Inertial Measurement Unit). This sensor gives the orientation of the Sphero relative to its origin position (set automatically when the Sphero starts).
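As an illustration of the decoding step, the streamed IMU angles arrive as signed 16-bit big-endian values. A minimal Python sketch (assuming a stream configured to carry only the three angles, in degrees; in the project this decoding is done in LabVIEW):

```python
import struct

def parse_imu(payload):
    """Decode pitch, roll and yaw from a data-streaming payload.
    Assumption: the stream was configured to send only the three IMU
    angles, each as a signed 16-bit big-endian value in degrees."""
    pitch, roll, yaw = struct.unpack(">hhh", payload[:6])
    return {"pitch": pitch, "roll": roll, "yaw": yaw}
```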

 

To validate our LabVIEW driver and the IMU signal decoding, we developed this very light interface, which controls two 3D 'eyes' according to the Sphero's orientation.

 

 

AR.Drone:

To start controlling the AR.Drone with LabVIEW, we downloaded a LabVIEW driver from the LabVIEW Hacker website. To better understand how this driver works, and what we could do with the drone, we decided to code a LabVIEW gamepad that controls the AR.Drone from our computer keyboard.

Gamepad

The drone communicates over the TCP/IP layer, which is natively supported by LabVIEW. Control orders are regularly sent to the drone, and data coming from its sensors are received asynchronously. When the drone is stable, data are sent roughly every 100 milliseconds; when it's moving, this period can go below 10 milliseconds.
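For reference, the AR.Drone SDK documents text-based AT commands (sent over UDP port 5556 to the drone's default address 192.168.1.1); floating-point arguments are transmitted as the 32-bit integer sharing the same IEEE-754 bit pattern. A minimal Python sketch of that encoding, with `send_pcmd` as our own illustrative helper name:

```python
import socket
import struct

def f2i(f):
    """Reinterpret a float's IEEE-754 bits as a signed 32-bit integer,
    which is how AT commands transmit floating-point arguments."""
    return struct.unpack("<i", struct.pack("<f", f))[0]

def send_pcmd(sock, seq, roll, pitch, gaz, yaw):
    """Illustrative sender for the documented AT*PCMD progressive
    command; seq is the mandatory increasing sequence number."""
    cmd = "AT*PCMD={},1,{},{},{},{}\r".format(
        seq, f2i(roll), f2i(pitch), f2i(gaz), f2i(yaw))
    sock.sendto(cmd.encode("ascii"), ("192.168.1.1", 5556))
```

For example, a half-speed backward tilt of -0.5 is sent as the integer -1090519040.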

First tests with the drone :

Combining the AR.Drone and Sphero

Sphero communicates over a Bluetooth connection, but the AR.Drone uses a Wi-Fi link, so a computer is used to bridge the two devices. A program written in LabVIEW handles the communication between the Sphero and the drone. It is also in charge of processing the data received from the Sphero to send the right orders to the drone, and of displaying relevant information taken from the devices' sensors.

dronesphero4

Interaction between the Sphero and the AR.Drone can be decomposed into 6 steps:

  1. Align the Sphero according to the user's position (initialisation step, executed only once when launching the program)
  2. Acquire data from the Sphero's sensors (IMU, collisions, accelerations)
  3. Process the data to convert the Sphero's orientation and state into orders for the AR.Drone (pitch, roll, yaw, take-off / landing)
  4. Send the orders to the AR.Drone
  5. Acquire data from the AR.Drone's sensors (linear speed, altitude, position)
  6. Display the data acquired from both devices, animate the AR.Drone 3D model, and display the video from the drone's camera
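Step 3 above can be sketched as a simple scaling from Sphero tilt angles to the normalised commands the drone expects. This is a hypothetical mapping: the ±30° working range and the clamping are our assumptions for illustration, not values taken from the project.

```python
def sphero_to_drone(pitch_deg, roll_deg, max_tilt_deg=30.0):
    """Map Sphero IMU angles (degrees) to normalised AR.Drone
    pitch/roll commands in [-1, 1], clamping at an assumed
    +/- max_tilt_deg working range."""
    def scale(angle):
        return max(-1.0, min(1.0, angle / max_tilt_deg))
    return scale(pitch_deg), scale(roll_deg)
```

Tilting the Sphero half-way through the working range thus produces a half-amplitude drone command, and anything beyond the range saturates.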

The software architecture used for this project decouples the AR.Drone, Sphero and display control. Each of these parts runs in its own thread (process), and communication between them is handled through events. Each process is thus independent; only event registration links them together.
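As an illustration of this event-driven decoupling (LabVIEW being graphical, this Python sketch only mirrors the idea, with made-up event names): each device handler runs in its own thread and talks to the others only through a shared event queue, so no part depends directly on another.

```python
import queue
import threading

events = queue.Queue()  # the only link between the processes

def sphero_reader(samples):
    """Stands in for the Bluetooth acquisition loop: publishes
    each IMU sample as an event, then signals completion."""
    for sample in samples:
        events.put(("sphero_imu", sample))
    events.put(("stop", None))

def dispatcher(handlers):
    """Routes each event to whichever handlers registered for it."""
    while True:
        kind, payload = events.get()
        if kind == "stop":
            break
        for handler in handlers.get(kind, []):
            handler(payload)

received = []
reader = threading.Thread(target=sphero_reader, args=([{"pitch": 1}],))
reader.start()
dispatcher({"sphero_imu": [received.append]})
reader.join()
```

Swapping out a device handler, or adding a new display, only means registering different handlers; the other threads are untouched.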

 GeneralSoftwareArchitecture EN

 

 

Demonstration at NI Days 2014: