1. Function Description
Mount a camera on the R216a prototype. The example in this article enables the two-wheeled car to autonomously search for and track a colored target using its two-degree-of-freedom pan-tilt platform.
2. Structure Description
The R216a prototype consists mainly of a two-wheeled chassis and a 2-degree-of-freedom pan-tilt gimbal.
3. Electronic Hardware
In this example, the following hardware is used:

Main control board | Basra main control board (compatible with Arduino Uno)
Expansion board | Bigfish2.1 expansion board
Battery | 7.4V lithium battery
Communication | 2510 communication adapter board
WiFi router | WiFi router
Other | Camera; a PC configured with the Visual Studio 2015 (.NET) OpenCV environment
Circuit Connection Instructions:
① Plug the 2510 communication adapter board into the docking interface of the Bigfish expansion board;
② Use 3 female-to-female DuPont cables to connect the 2510 communication adapter board to the WiFi router: GND-GND, RX-RX, TX-TX;
③ Connect a USB cable between the interface of the 2510 communication adapter board and a USB port on the WiFi router;
④ Connect the camera's cable to the WiFi router.
4. Function Realization
Implementation idea: have the two-wheeled gimbal car track a blue ball.
4.1 How it works
① The camera captures image frames;
② The frames are streamed over WiFi to the PC (running the OpenCV environment configured under VS2015);
③ OpenCV's CamShift color-tracking algorithm computes the coordinates of the target object's center point;
④ The camera image is divided into a 3×3 grid of nine cells;
⑤ The cell of the grid containing the target is determined;
⑥ If the target drifts out of the center cell, the camera is adjusted to correct the offset so that the target returns to the center of the frame;
⑦ To adjust the camera, the host computer sends correction commands over WiFi to the lower computer (the main control board), which receives them and drives the camera-equipped car to make the corresponding correction movements.
If the target is lost, the gimbal on the car sweeps back and forth to search for it.
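The grid check in steps ④ and ⑤ can be sketched as a simple cell classifier. The function below is an illustrative sketch, not part of the original routines; `gridCell` is a hypothetical helper that, given the target center and the frame size (the host routine later uses 320×240), returns which of the nine cells the target occupies, with {1, 1} being the center cell that needs no correction.

```cpp
#include <algorithm>
#include <utility>

// Classify a point (x, y) into one of the 3x3 grid cells of a w-by-h frame.
// Returns {row, col}, each in 0..2; the center cell is {1, 1}.
std::pair<int, int> gridCell(int x, int y, int w, int h)
{
    int col = std::min(x * 3 / w, 2);  // 0 = left, 1 = center, 2 = right
    int row = std::min(y * 3 / h, 2);  // 0 = top,  1 = center, 2 = bottom
    return {row, col};
}
```

With a 320×240 frame, `gridCell(160, 120, 320, 240)` lands in the center cell {1, 1}, so no correction would be needed; `gridCell(40, 120, 320, 240)` falls in the left column, which would map to a turn-left correction.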
4.2 Sample program
Programming environment: Arduino 1.8.19
① Lower computer routine
Download the reference routine (example.ino) to the main control board. Power on the router; once it has started, connect the router to the TX and RX serial pins of the main control board, and connect the PC to the router's WiFi network. The lower computer receives the commands produced by the host computer's image processing, drives the pan-tilt accordingly, and the pan-tilt follows the movement of the target object.
/*
------------------------------------------------------------------------------
Copyright 2023 Robottime(Beijing) Technology Co., Ltd. All Rights Reserved.
Distributed under MIT license. See file LICENSE for detail or copy at
https://opensource.org/licenses/MIT
by Robotway 2023-04-24
https://www.robotway.com/
------------------------------------------------------------------------------
*/

/*
  car_example  2018/06/07
  ---------------------
  Motor:  left: 9,5   right: 10,6
  Servo:  bottom: 4   top: 7
*/

#include <Servo.h>

#define BOTTOM_SERVO_MIN    60
#define BOTTOM_SERVO_MAX    150   //175
#define BOTTOM_SERVO_MIDDLE 85
#define TOP_SERVO_MIN       85
#define TOP_SERVO_MAX       175

const String CMD_LEFT    = "L";
const String CMD_RIGHT   = "R";
const String CMD_STOP    = "S";
const String CMD_FORWARD = "F";
const String CMD_LOST    = "N";

Servo myServo[2];
int port0 = 4;
int port1 = 7;
int servo_value[2] = {85, 105};
int servo_move_angle = 1;

void setup()
{
  Serial.begin(9600);
  pinMode(5, OUTPUT);
  pinMode(6, OUTPUT);
  pinMode(9, OUTPUT);
  pinMode(10, OUTPUT);
  ServoInit();
  delay(1000);
}

void loop()
{
  String data = SerialRead();
  if(data == CMD_STOP)         { Stop(); }
  else if(data == CMD_FORWARD) { Forward(); }
  else if(data == CMD_LEFT)    { Left();  delay(60); Stop(); }
  else if(data == CMD_RIGHT)   { Right(); delay(60); Stop(); }
  else if(data == CMD_LOST)    { FindObject(); }
}

String SerialRead()
{
  String str = "";
  while(Serial.available())
  {
    str += char(Serial.read());
  }
  return str;
}

int Angle2Pwm(int n)
{
  return map(n, 0, 180, 500, 2500);
}

void ServoInit()
{
  myServo[0].attach(port0);
  myServo[1].attach(port1);
  for(int i = 0; i < 2; i++)
  {
    myServo[i].write(Angle2Pwm(servo_value[i]));
  }
}

void FindObject()
{
  const int times = 30;
  int dir = 0;
  for(;;)
  {
    String data = SerialRead();
    if(data != CMD_LOST && data != "")
    {
      if(servo_value[0] <= BOTTOM_SERVO_MIDDLE)
      {
        dir = 1;  // turn left
        for(int i = 0; i < abs(servo_value[0] - BOTTOM_SERVO_MIDDLE); i++)
        {
          servo_value[0] += 1;
          myServo[0].write(Angle2Pwm(servo_value[0]));
          delay(times);
        }
      }
      else if(servo_value[0] > BOTTOM_SERVO_MIDDLE)
      {
        dir = 2;  // turn right
        for(int i = 0; i < abs(servo_value[0] - BOTTOM_SERVO_MIDDLE); i++)
        {
          servo_value[0] -= 1;
          myServo[0].write(Angle2Pwm(servo_value[0]));
          delay(times);
        }
      }
      break;
    }
    if(servo_value[0] <= BOTTOM_SERVO_MIN)
    {
      servo_move_angle = 1;
      servo_value[0] = BOTTOM_SERVO_MIN;
    }
    else if(servo_value[0] >= BOTTOM_SERVO_MAX)
    {
      servo_move_angle = -1;
      servo_value[0] = BOTTOM_SERVO_MAX;
    }
    servo_value[0] += servo_move_angle;
    myServo[0].write(Angle2Pwm(servo_value[0]));
    delay(times);
  }
  if(dir == 1)      { Left();  delay(500); Stop(); }
  else if(dir == 2) { Right(); delay(500); Stop(); }
}

void Forward()
{
  analogWrite(5, 120); analogWrite(6, 0);
  analogWrite(9, 120); analogWrite(10, 0);
}

void Left()
{
  analogWrite(5, 105); analogWrite(6, 0);
  analogWrite(9, 0);   analogWrite(10, 105);
}

void Right()
{
  analogWrite(5, 0);   analogWrite(6, 105);
  analogWrite(9, 105); analogWrite(10, 0);
}

void Stop()
{
  digitalWrite(5, HIGH); digitalWrite(6, HIGH);
  digitalWrite(9, HIGH); digitalWrite(10, HIGH);
}
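The Angle2Pwm() helper above relies on Arduino's map(), which does integer linear interpolation. A self-contained sketch of the same mapping (arduinoMap here reimplements the formula from the Arduino reference, for illustration only):

```cpp
// Arduino's map(): integer linear interpolation (the division truncates).
long arduinoMap(long x, long in_min, long in_max, long out_min, long out_max)
{
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;
}

// Equivalent of the sketch's Angle2Pwm(): 0..180 degrees -> 500..2500 microseconds.
int angle2Pwm(int n)
{
    return (int)arduinoMap(n, 0, 180, 500, 2500);
}
```

For example, angle2Pwm(90) is 1500, the usual neutral servo pulse. One caveat: in the stock Arduino Servo library, Servo::write() treats argument values below 544 as angles rather than microseconds, so mapped values near the low end of this range (angle2Pwm(0) is 500) may not be interpreted as pulse widths; Servo::writeMicroseconds() avoids that ambiguity.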
② Host computer routine
The following reference routine (MainWindow.xaml.cs) realizes tracking of the blue ball by the two-wheeled smart car. You can refer to the demonstration video to complete the experiment.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using System.Windows.Forms;
using System.Runtime.InteropServices;
using System.Threading;
using System.Net;
using System.Net.Sockets;

namespace Project
{
    /// <summary>
    /// PTZ tracking
    /// </summary>
    public partial class MainWindow : Window
    {
        // Import the camshift dynamic link library
        [DllImport("Camshift_DLL.dll", CharSet = CharSet.Ansi)]
        public static extern void Camshift(
            [MarshalAs(UnmanagedType.LPStr)]string ip_address, // picture or video address
            ref int xpos,   // X coordinate of the detection frame center
            ref int ypos,   // Y coordinate of the detection frame center
            ref int td);    // diagonal length of the detection area

        // Capture window size
        int cap_w = 320, cap_h = 240;

        // x, y coordinates of the tracked object's center, and its diagonal length
        int x = 0, y = 0, d = 0;

        // Command variables
        string CMD_FORWARD = "", CMD_TURN_LEFT = "", CMD_TURN_RIGHT = "", CMD_STOP = "", CMD_LOST = "";

        // Decision-boundary struct
        public struct Boundaries
        {
            public int x_left;
            public int x_right;
            public int y_up;
            public int y_down;
            public int d_min;
            public int d_max;
        }
        Boundaries boundaries = new Boundaries();

        public MainWindow()
        {
            InitializeComponent();
        }

        private void Window_Loaded(object sender, RoutedEventArgs e)
        {
            GetIni();
            SetPosition();
            CmdInit();
            StructInit();
        }

        // Variable initialization
        private void CmdInit()
        {
            CMD_FORWARD = "F";
            CMD_TURN_LEFT = "L";
            CMD_TURN_RIGHT = "R";
            CMD_STOP = "S";
            CMD_LOST = "N";
        }

        // Struct initialization
        private void StructInit()
        {
            boundaries.x_left = 120;
            boundaries.x_right = 240;
            boundaries.y_up = 80;
            boundaries.y_down = 160;
            boundaries.d_min = 50;
            boundaries.d_max = 150;
        }

        // Read the ini configuration file
        private void GetIni()
        {
            ini_RW.FileName = System.Windows.Forms.Application.StartupPath + "\\Config.ini";
            this.videoAddress.Text = ini_RW.ReadIni("VideoUrl", "videourl", "");
            this.ipAddress.Text = ini_RW.ReadIni("ControlUrl", "controlUrl", "");
            this.portBox.Text = ini_RW.ReadIni("ControlPort", "controlPort", "");
        }

        // Change settings
        private void setBtn_Click(object sender, RoutedEventArgs e)
        {
            ini_RW.WriteIni("VideoUrl", "videourl", this.videoAddress.Text);
            ini_RW.WriteIni("ControlUrl", "controlUrl", this.ipAddress.Text);
            ini_RW.WriteIni("ControlPort", "controlPort", this.portBox.Text);
            System.Windows.MessageBox.Show("The configuration is successful! Please restart the program for it to take effect.",
                "Configuration Information", MessageBoxButton.OK, MessageBoxImage.Information);
            //this.Close();
        }

        // Command sending function
        void SendData(string data)
        {
            try
            {
                IPAddress ips = IPAddress.Parse(ipAddress.Text.ToString()); // e.g. "192.168.8.1"
                IPEndPoint ipe = new IPEndPoint(ips, Convert.ToInt32(portBox.Text.ToString())); // build an IPEndPoint from ip and port
                Socket c = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp); // create a Socket
                c.Connect(ipe); // connect to the server
                byte[] bs = Encoding.ASCII.GetBytes(data);
                c.Send(bs, bs.Length, 0); // send the command
                c.Close();
            }
            catch (Exception e)
            {
                System.Windows.Forms.MessageBox.Show(e.Message);
            }
        }

        // Position-limit judgment for the tracked object
        private void LineDetect(int _x, int _y, int _d)
        {
            try
            {
                // Determine the direction of rotation
                if (_x > 0 && _x <= boundaries.x_left)
                {
                    SendData(CMD_TURN_LEFT);
                }
                else if (_x > boundaries.x_right && _x < cap_w)
                {
                    SendData(CMD_TURN_RIGHT);
                }
                // Determine whether to move forward
                else if (_d > boundaries.d_min && _d < boundaries.d_max)
                {
                    SendData(CMD_FORWARD);
                }
                else if ((_x > boundaries.x_left && _x < boundaries.x_right) && (_d >= 0 && _d <= boundaries.d_min))
                {
                    SendData(CMD_LOST);
                }
                else
                {
                    SendData(CMD_STOP);
                }
            }
            catch { };
        }

        // Object position initialization
        private void SetPosition()
        {
            var color = new SolidColorBrush((System.Windows.Media.Color)System.Windows.Media.ColorConverter.ConvertFromString("#FFACAAAA"));
            objEllipse.Height = 30;
            objEllipse.Width = 30;
            objEllipse.Fill = color;
            var left_distance = (cap_w - objEllipse.Width) / 2;
            var top_distance = (cap_h - objEllipse.Height) / 2;
            Canvas.SetLeft(objEllipse, left_distance);
            Canvas.SetTop(objEllipse, top_distance);
        }

        // Position update for the tracked object
        private void PositionUpdate(int x, int y, int d)
        {
            LineDetect(x, y, d);
            Canvas.SetLeft(objEllipse, x);
            Canvas.SetTop(objEllipse, y);
            posLable.Content = x + " , " + y + " , " + d;
        }

        // Thread function
        private void ThreadCapShow()
        {
            try
            {
                while (true)
                {
                    this.Dispatcher.Invoke(new Action(delegate
                    {
                        string ip = this.videoAddress.Text;
                        Camshift(ip, ref x, ref y, ref d);
                        PositionUpdate(x - 15, y - 15, d);
                    }));
                }
            }
            catch { };
        }

        // Open the tracking window
        private void openBtn_Click(object sender, RoutedEventArgs e)
        {
            try
            {
                Thread m_thread = new Thread(ThreadCapShow);
                m_thread.IsBackground = true;
                m_thread.Start();
            }
            catch { };
        }
    }
}
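The LineDetect() judgment in the host routine reduces to a pure function of the target's x coordinate and the detection-box diagonal d. Re-expressed as a standalone sketch for clarity (decideCommand is an illustrative name; the thresholds come from StructInit(): x_left = 120, x_right = 240, d_min = 50, d_max = 150, with cap_w = 320):

```cpp
#include <string>

// Decide which single-letter command to send, mirroring the host routine's LineDetect().
// x: target center's x coordinate; d: diagonal length of the detection box.
std::string decideCommand(int x, int d)
{
    const int x_left = 120, x_right = 240, cap_w = 320;
    const int d_min = 50, d_max = 150;

    if (x > 0 && x <= x_left)          return "L";  // target on the left: turn left
    else if (x > x_right && x < cap_w) return "R";  // target on the right: turn right
    else if (d > d_min && d < d_max)   return "F";  // centered and mid-sized: move forward
    else if ((x > x_left && x < x_right) && (d >= 0 && d <= d_min))
                                       return "N";  // centered but tiny or absent: target lost, search
    else                               return "S";  // otherwise (e.g. very close): stop
}
```

These single-letter strings are exactly the CMD_* constants the lower-computer sketch matches against in loop().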
For the complete routine source code and the 3D files of the prototype, see Two-wheel gimbal car-tracking color targets.