
HO CHI MINH CITY UNIVERSITY OF TECHNOLOGY AND EDUCATION

FACULTY OF ELECTRICAL AND ELECTRONICS ENGINEERING

DEPARTMENT OF AUTOMATIC CONTROL

*****

GRADUATION THESIS

Research, Design and Construct a Quadcopter for Searching Accidents

in the Outdoor Environment

Students: Nguyen Thanh Trung – 15151236

Tran Ngoc Khanh – 15151163


HCMC UNIVERSITY OF TECHNOLOGY AND EDUCATION

FACULTY OF ELECTRICAL AND ELECTRONICS ENGINEERING

DEPARTMENT OF AUTOMATIC CONTROL

SOCIALIST REPUBLIC OF VIETNAM
Independence – Freedom – Happiness

o0o

Ho Chi Minh City, Jul 2nd, 2019

THESIS ASSIGNMENTS

Student 1: Nguyễn Thành Trung – Student ID: 15151236

Student 2: Trần Ngọc Khanh – Student ID: 15151163

Major: Automation and Control Engineering Technology

Training system: Formal Training System

Academic year: 2015 – Class: 151511A

- Operating environment: indoor and outdoor

- Flight time: > 10 minutes

- Control range: > 100 m

II Implementation Content:

- Build a quadcopter model with a camera

- Build a flight control algorithm based on the PID controller and tune the PID gains

- Build a landing algorithm with image processing

- Collect data

- Write the thesis

III ASSIGNED DATE: March 22nd, 2019

IV COMPLETE DATE: June 30th, 2019

V ADVISOR: My-Ha Le, Ph.D

My-Ha Le, Ph.D


HCMC UNIVERSITY OF TECHNOLOGY AND EDUCATION

FACULTY OF ELECTRICAL AND ELECTRONICS ENGINEERING

DEPARTMENT OF AUTOMATIC CONTROL

SOCIALIST REPUBLIC OF VIETNAM
Independence – Freedom – Happiness

o0o

Ho Chi Minh City, Jul 2nd, 2019

PROJECT IMPLEMENTATION SCHEDULE

Student 1: Nguyen Thanh Trung – Student ID: 15151236

Student 2: Tran Ngoc Khanh – Student ID: 15151163

Major: Automation and Control Engineering Technology

Thesis: Research, Design and Construct a Quadcopter

for Searching Accidents in the Outdoor Environment

Advisor: My-Ha Le, Ph.D

Mar 22nd – 24th, 2019: Survey and select the topic

Mar 25th – 31st, 2019: Build the quadcopter flight model

Apr 1st – 7th, 2019: Test and fix the quadcopter flight model

Apr 8th – 14th, 2019: Build and program the controller

Apr 15th – 21st, 2019: Test flight with the transmitter and tune the PID controller

Apr 22nd – 30th, 2019: Upgrade with the HC-SR04 ultrasonic sensor

May 1st – 5th, 2019: Test and debug the new flight code for altitude hold

May 6th – 12th, 2019: Upgrade with the NEO-6M GPS module and transmit the GPS signal

May 13th – 19th, 2019: Test the camera with the Raspberry Pi 3 (connection and operation)

May 20th – 31st, 2019: Research machine learning, AI, and image processing; compare and choose a model suited to our task and hardware; generate data for training

Jun 1st – 9th, 2019: Train on the data; debug the detection and recognition code

Jun 10th – 16th, 2019: Connect the Arduino Uno and Raspberry Pi 3; build the code

Jun 17th – 23rd, 2019: Combine and correct the program; final test and record video and images

Jun 24th – 30th, 2019: Complete the graduation thesis

ADVISOR

My-Ha Le, Ph.D.


HCMC UNIVERSITY OF TECHNOLOGY AND EDUCATION

FACULTY OF ELECTRICAL AND ELECTRONICS ENGINEERING

DEPARTMENT OF AUTOMATIC CONTROL

SOCIALIST REPUBLIC OF VIETNAM
Independence – Freedom – Happiness

o0o

Ho Chi Minh City, Jul 2nd, 2019

ADVISOR’S COMMENT SHEET

Student 1: Nguyen Thanh Trung – Student ID: 15151236

Student 2: Tran Ngoc Khanh – Student ID: 15151163

Major: Automation and Control Engineering Technology

Thesis: Research, Design and Construct a Quadcopter

for Searching Accidents in the Outdoor Environment

Advisor: My-Ha Le, Ph.D

COMMENT

1 About the thesis’s contents:

- The students have completed the requirements of the graduation thesis

2 Advantages:

- The system works stably in real time

- The system detects accidents with high accuracy

3 Disadvantages:

- Needs to be tested on more diverse data

4 Propose defending thesis?

- Yes

5 Rating:

- Excellent

6 Mark: 9.8/10 (In words: nine point eight)

Ho Chi Minh City, July 4th 2019

ADVISOR

My-Ha Le, Ph.D



ACKNOWLEDGMENT

We have finally completed our final project at the end of our education at Ho Chi Minh City University of Technology and Education. To be honest, it is not perfect, but we have tried throughout the term to make this project better day by day.

We would like to place on record our sincere thanks to the Faculty of Electrical and Electronics Engineering and the Department of Automatic Control for supporting us with the things we needed and for allowing us to carry out this project.

We wish to express our sincere thanks to My-Ha Le, Ph.D., for guiding, instructing, and supporting us throughout the research of this project.

We are grateful to Do Truong Dong, Duong Minh Thien, Tran Le Anh, Le Tien Sy, Nguyen Trung Hieu, Ngo Tran Khanh Dang, Vo Minh Cong, Dang Quoc Vu, and Nguyen Duy Thong for providing the facilities necessary for the research.

During this term we worked on the final project together in the laboratory with our friends Le Manh Cuong, Dao Duy Phuong, Phan Vo Thanh Lam, Vo Anh Quoc, Tran Van Son, and Duong Thuy Huynh. They supported us greatly, pointing out what was and was not working, helping us collect data, and contributing ideas that helped us improve and develop the project day by day.

Finally, we also thank our parents, families, and friends for their encouragement, support, and attention.


ABSTRACT

In this report, we explain the theory of quadcopter dynamics, image processing, flight control, and the operating principles of the circuit boards used.

Furthermore, we focus on machine learning and artificial intelligence, giving a comprehensive overview of deep learning-based object detection methods for aerial images captured by a quadcopter.

The result of our project is a target detection and positioning system, together with an aerial image collection pipeline, developed and integrated into the quadcopter.

Based on the results obtained in practice, we assess the system and propose future directions of development.


CONTENTS

THESIS ASSIGNMENTS i

PROJECT IMPLEMENTATION SCHEDULE ii

ADVISOR’S COMMENT SHEET iv

REVIEWER’S COMMENT SHEET v

COMMITMENT vii

ACKNOWLEDGMENT viii

ABSTRACT ix

CONTENTS x

LIST OF FIGURES xv

LIST OF TABLES xx

LIST OF ABBREVIATIONS xxi

CHAPTER 1: OVERVIEW 1

1.1 Motivation 1

1.2 Objectives of the thesis 1

1.3 Related works 2

1.3.1 International research 2

1.3.1.1 UAV Market 2

1.3.1.2 Rescue of Quadcopter 3

1.3.1.3 Hydrogen-Powered Drones: The Future of Transportation 4

1.3.1.4 Follow Me Drone Recognition Technology 5

1.3.1.5 Mind-Controlled Drone 5

1.3.2 Domestic research 6

1.3.2.1 Domestic student research 6

1.3.2.2 Military drone research 7

1.4 Limitation and propose a new algorithm 8

1.4.1 Propose a new algorithm 8

1.4.2 Limitation 9

1.5 Contents 9

CHAPTER 2: THEORY BASIS OF QUADCOPTER FOR SEARCHING ACCIDENTS 11

2.1 Flight Control Theory 11

2.1.1 Basic Motion 11

2.1.2 Six Degrees of Freedom (6DOF) 12

2.1.3 Representing Orientation 13

2.1.4 Euler Angles 14


2.1.5 Describe Motion 15

2.1.5.1 Simultaneous Translation and Rotation – The Chain Rule 15

2.1.5.2 Newton’s Second Law of Motion 16

2.1.5.3 Force, Mass, and Acceleration 16

2.1.5.4 Propeller Thrust 17

2.1.5.5 Gravity Force in Body Coordinates 18

2.1.5.6 Mass and Moment of Inertia 19

2.1.5.7 Acceleration 20

2.1.5.8 Angular Acceleration and Rotational Motion 21

2.1.5.9 Angular Velocity vs Euler Angle Rates 21

2.1.5.10 The Navigation Coordinates 22

2.1.5.11 The Quadcopter Equations of Motion 23

2.1.6 Model of aerodynamic calculation 24

2.2 Inertial Measurement Unit Systems 24

2.2.1 Overview 24

2.2.2 Main Types of Motion Sensor Devices 26

2.2.2.1 Accelerometer 26

2.2.2.1.1 Overview 26

2.2.2.1.2 MEMS accelerometer 28

2.2.2.2 Gyroscope 28

2.2.2.2.1 Overview 28

2.2.2.2.2 MEMS Gyroscope 29

2.2.3 Noise Processing 30

2.4 PID Controller 32

2.4.1 Overview 32

2.4.2 Proportional term 34

2.4.3 Integral Term 35

2.4.4 Derivative Term 36

2.4.5 Tuning PID Controller 37

2.5 Artificial Intelligence 38

2.5.1 Types of Artificial Intelligence 39

2.5.2 Machine learning 40

2.5.2.1 Overview 40

2.5.2.2 Details of some of these components 43

2.5.3 Deep Learning 44

2.6 Convolutional neural network 45

2.6.1 Overview 45


2.6.2 Structure of convolutional neural network 46

2.6.2.1 Convolutional layer 47

2.6.2.2 Non-linearity 49

2.6.2.3 Stride and Padding 50

2.6.2.4 Pooling layer 52

2.6.2.5 Flattening layer 54

2.6.2.6 Fully-Connected layer 54

2.6.2.7 Softmax 55

2.6.3 Comparison of CNN Models 57

CHAPTER 3: DESIGN AND BUILD THE HARDWARE 58

3.1 Hardware Requirements 58

3.2 Block Diagram 58

3.3 Hardware Introduction 58

3.3.1 Frame and PCB 58

3.3.1.1 Overview 58

3.3.1.2 F450 Flame Wheel 59

3.3.2 Brushless Direct Current Motor (BLDC Motor) 60

3.3.2.1 Overview 60

3.3.2.2 Structure 60

3.3.2.3 Operational motor theory and control principle 62

3.3.2.3.1 Operational motor theory 62

3.3.2.3.2 Controlling principle 63

3.3.2.4 BLDC motor type A2212/13T 1000KV 65

3.3.4 Electronic Speed Control (ESC) 65

3.3.4.1 Overview 65

3.3.4.2 Structure 66

3.3.4.3 Hobbywing SkyWalker 40A ESC 67

3.3.5 Arduino Microcontroller 68

3.3.5.1 Overview 68

3.3.5.2 Arduino Uno R3 68

3.3.5.3 Arduino Nano 70

3.3.6 Accelerometer and Gyroscope 72

3.3.6.1 Overview 72

3.3.6.2 MPU-6050 GY-521 72

3.3.7 Raspberry Pi 73

3.3.7.1 Overview 73

3.3.7.2 Raspberry Pi 3 Model B 73


3.3.8 Radio Frequency - Transmitter and Receiver System 74

3.3.8.1 Overview 74

3.3.8.2 Flight Control Transmitter and Receiver 75

3.3.8.2 NRF24L01 + PA + LNA 2.4Ghz Transceiver 76

3.3.9 LiPo Battery 77

3.3.9.1 Overview 77

3.3.9.2 Parameters of LiPo battery 78

3.3.9.2.1 Voltage 78

3.3.9.2.2 Cell Count (S rating) 78

3.3.9.2.3 Capacity 79

3.3.9.2.4 Discharge Rating ("C" Rating) 79

3.3.9.5 Infinity LiPo Battery 79

3.3.10 Propeller 80

3.3.10.1 Overview 80

3.3.10.2 FC 9045 Propeller 81

3.3.11 Webcam 82

3.3.11.1 Overview 82

3.3.11.2 Webcam Logitech C170 83

3.3.12 Ultrasonic sensor 84

3.3.12.1 Overview 84

3.3.12.2 Ultrasonic sensor HC-SR04 84

3.3.13 Global Positioning System 85

3.3.13.1 Overview 85

3.3.13.2 Ublox NEO 6M GPS Module 86

3.4 Diagram of Hardware Connection 87

3.4.1 GPS Receiver and Warning Station 87

3.4.2 Quadcopter Model 88

CHAPTER 4: CONTROL ALGORITHMS 89

4.1 Network Architecture 89

4.1.1 MobileNet-SSD Model 89

4.1.2 MobileNet as Feature Extraction 91

4.1.3 SSD Meta Structure 95

4.2 PID Controllers for Quadcopter Balancing 97

4.4 Flow Charts 98

CHAPTER 5: EXPERIMENTS 100

5.1 The Hardware Experiments Results 100

5.2 Experimental Environments 102


5.3 Data Description 103

5.4 Data Augmentations 104

5.5 Training Process 105

5.6 Compare MobileNet-SSD Model 109

5.7 Flight and Detection Results 110

5.7.1 Flight Results 110

5.7.1.1 Indoor Results 110

5.7.1.2 Outdoor Results 110

5.7.2 Detection Results 111

5.7.2.1 Indoor Results 111

5.7.2.2 Outdoor Results 112

5.7.2.2.1 Daylight Environment 112

5.7.2.2.2 Evening Light Environment 114

5.7.2.2.3 The Object Obscured and Lack of Light 115

5.8 GPS Location Results 116

CHAPTER 6: CONCLUSIONS AND DEVELOPMENTS 118

6.1 Conclusions 118

6.1.1 Achieved Results 118

6.1.2 Advantages 118

6.1.3 Disadvantages 119

6.2 Development Directions 119

REFERENCES 120


LIST OF FIGURES

Figure 1.1: UAV market by Region 2025 3

Figure 1.2: Traverse of a narrow gap to enter a collapsed building 3

Figure 1.3: Hydrogen-Powered Drone model 4

Figure 1.4: SIMTOO Follow Me Tracking Drone 4K 5

Figure 1.5: Brain-controlled drone students at the University of Florida 6

Figure 1.6: The drone is researched by HCMUTE students 7

Figure 1.7: The drone is researched by HCMUTE students 7

Figure 1.8: The graduate thesis drone of HCMUTE students 7

Figure 1.9: MD4-1000 is used in Vietnamese Army 8

Figure 1.10: Propose an algorithm 9

Figure 2.1: The basic motion of quadcopter 11

Figure 2.2: Rigid body and Flexible body 12

Figure 2.3: Inertial frame and Body frame 13

Figure 2.4: The idea of Euler angles 14

Figure 2.5: The thrust force is represented for each propellers 18

Figure 2.6: Euler Angle Rates to Angular Velocity 21

Figure 2.7: Roll, Pitch, Yaw angles 25

Figure 2.8: Acceleration of a solid ball in space 27

Figure 2.9: Acceleration of the ball in two axes x and z 27

Figure 2.10: MEMS accelerometer 28

Figure 2.11: Microelectromechanical Systems Gyro 29

Figure 2.12: Coriolis effect 29

Figure 2.13: Complementary Filter 32

Figure 2.14: Close loop control scheme of PID controller 32

Figure 2.15: Response of a typical PID closed loop system 33

Figure 2.16: The proportional term 34


Figure 2.17: Response of PV to step change of SP vs time, for three values of 𝐾𝑝 35

Figure 2.18: The integral term 35

Figure 2.19: Response of PV to step change of SP vs time, for three values of 𝐾𝑖 36

Figure 2.20: The derivative term 36

Figure 2.21: Response of PV to step change of SP vs time, for three values of 𝐾𝑑 36

Figure 2.22: Effects of increasing a parameter independently 38

Figure 2.23: Results of PID tuning 38

Figure 2.24: Types of Artificial Intelligence 39

Figure 2.25: Relation Between AI, Machine Learning 41

Figure 2.26: Structure of a biological neuron 42

Figure 2.27: Structure of artificial neurons 43

Figure 2.28: Real-life images 46

Figure 2.29: ConvNets being used for recognizing 46

Figure 2.30: CNN architecture 46

Figure 2.31: The input and filter of a CNN 47

Figure 2.32: The convolutional operation of a CNN 47

Figure 2.33: The result of a convolution operation 48

Figure 2.34: Convolution 2D with 1 Kernel 48

Figure 2.35: 3D Convolution 49

Figure 2.36: Convolution 2D with several kernels 49

Figure 2.37: The feature map after applying the rectifier function 50

Figure 2.38: Resulting feature map when implementing a 3 x 3 convolution with a stride of 2 and padding of 1 51

Figure 2.39: Apply zero Padding for input image 51

Figure 2.40: Convolution – 1 kernel, stride 1, padding 1 52

Figure 2.41: Pooling layer before and after 52

Figure 2.42: Illustration of max pooling drawback 53

Figure 2.43: Illustration of average pooling drawback 53

Figure 2.44: A fully connected layer in a deep network 55


Figure 2.45: A multilayer deep fully connected network 55

Figure 2.46: Softmax function 56

Figure 2.47: Softmax function result 56

Figure 2.48: Comparison of CNN Models 57

Figure 3.1: Block diagram 58

Figure 3.2: F450 4-Axis PCB Quadcopter Frame with motors and propellers 59

Figure 3.3: Details of F450 59

Figure 3.4: Outrunner rotor 61

Figure 3.5: Inrunner rotor 61

Figure 3.6: Structure of the basic BLDC motor for drone 62

Figure 3.7: Motor Rotation 62

Figure 3.8: Working principal with Hall position sensor 63

Figure 3.9: Working principle without sensor 64

Figure 3.10: A2212/13T 1000KV Brushless Motor 65

Figure 3.11: Structure diagram of basic ESC 66

Figure 3.12: Hobbywing SkyWalker 40A ESC 67

Figure 3.13: Arduino Uno R3 69

Figure 3.14: Arduino Uno pinout diagram 70

Figure 3.15: Arduino Nano 70

Figure 3.16: Arduino Nano pinout diagram 71

Figure 3.17: GY-521 MPU-6050 Module 72

Figure 3.18: Raspberry Pi 3 model B 74

Figure 3.19: Devo 7 Transmitter 75

Figure 3.20: Receiver RX701 76

Figure 3.21: NRF24L01+PA+LNA 2.4GHz Wireless Transceiver Module 77

Figure 3.23: Infinity LiPo Battery 79

Figure 3.24: Size of propeller 80

Figure 3.25: The types of the propeller 81

Figure 3.26: FC 9045 Propeller 82


Figure 3.27: Webcam Logitech C170 83

Figure 3.28: Timing diagram of ultrasonic sensor HC-SR04 84

Figure 3.29: Ultrasonic sensor HC-SR04 85

Figure 3.30: Ublox NEO 6M GPS Module 86

Figure 3.31: Diagram of GPS Receiver Station Connection 87

Figure 3.32: Diagram of Hardware Connection 88

Figure 4.1: MobileNet-Single Shot MultiBox Detector (SSD) network feature pyramid 90

Figure 4.2: MobileNet-SSD network architecture 90

Figure 4.3: Components of the MobileNet base network 92

Figure 4.4: Standard Convolution Filters 93

Figure 4.5: Depthwise Convolutional Filters 94

Figure 4.6: 1x1 Convolutional Filters called Pointwise Convolution in the context of Depthwise Separable Convolution 94

Figure 4.7: Illustration of intersection over union 96

Figure 4.10: Default boxes on feature maps 97

Figure 4.11: Yaw, Pitch, Roll Angles 97

Figure 4.12: The block diagram of the system 98

Figure 4.13: The overall block diagram of the system 98

Figure 4.14: Flowchart of the quadcopter model 99

Figure 5.1: Quadcopter Model 101

Figure 5.2: The object used to simulate a human 101

Figure 5.3: GPS station 102

Figure 5.4: Daylight environment 102

Figure 5.5: Evening light environment 103

Figure 5.6: Sample images from the dataset 103

Figure 5.7: Typical Collected Dataset 104

Figure 5.8: Data augmentation examples 105

Figure 5.9: Creating labels for the object 106


Figure 5.10: Run command from Ubuntu Prompt 107

Figure 5.11: Total loss value 107

Figure 5.12: Classification loss value 108

Figure 5.13: Localization loss value 108

Figure 5.14: The result of the training 108

Figure 5.15: Compare with HMM and SVM-HMM 109

Figure 5.16: Compare with MobileNet only 109

Figure 5.17: Indoor flight 110

Figure 5.18: Manual flight control 110

Figure 5.19: Daylight flight results 110

Figure 5.20: Evening light flight results 111

Figure 5.21: Typical Indoor Detection 111

Figure 5.22: Detection of an object obscured by less than 50% 112

Figure 5.23: Missed detection of an object obscured by more than 50% 112

Figure 5.24: Typical daylight detection results on the road 113

Figure 5.25: Typical daylight detection results on the grass 113

Figure 5.26: The score confidence around 97% 113

Figure 5.27: Typical evening light detection results 114

Figure 5.28: The Score confidence around 69 % 114

Figure 5.29: Detection of an object obscured by less than 50% 115

Figure 5.30: Missed detection of an object obscured by more than 50% 115

Figure 5.31: Undetected object 115

Figure 5.32: GPS Station 116

Figure 5.33: Typical Map of Flight Route and Detected Location 116


LIST OF TABLES

Table 3.1: Specification of A2212/13T 1000KV Brushless Motor 65

Table 3.2: Control status of basic Electronic Speed Control 67

Table 3.3: Specification of Hobbywing SkyWalker 40A ESC 67

Table 3.4: Specification of Arduino Uno R3 69

Table 3.5: Specification of Arduino Nano 71

Table 3.6: Specification of GY-521 MPU-6050 72

Table 3.8: Specification of Devo 7 Transmitter 76

Table 3.9: Specification of Receiver RX701 76

Table 3.10: Specification of NRF24L01+PA+LNA 2.4GHz 77

Table 3.11: Specification of Infinity LiPo Battery 80

Table 3.12: Specification of propeller 82

Table 3.13: Specification of Webcam Logitech C170 83

Table 3.14: Specification of Ultrasonic sensor HC-SR04 85

Table 3.15: Specification of the Ublox NEO 6M GPS Module 87

Table 4.1: Particular MobileNet-SSD network architecture 91

Table 4.2: Parameter of the output convolution 94


LIST OF ABBREVIATIONS

1 UAV - Unmanned Aerial Vehicle

2 ESC - Electronic Speed Control

3 BLDC - Brushless Direct Current Motor

4 BEMF - Back Electromotive Force

5 PID - Proportional Integral Derivative

6 IMU - Inertial Measurement Unit

7 GPS - Global Positioning System

8 MEMS - Micro Electromechanical System

9 SSD - Single Shot Detector

10 6DOF - Six Degrees of Freedom


CHAPTER 1: OVERVIEW

1.1 Motivation

A UAV (Unmanned Aerial Vehicle) is defined as a vehicle without an onboard pilot that uses aerodynamic forces to move. In addition to performing military missions, UAVs are also researched and applied to scientific tasks such as volcanic observation, environmental investigation, and cave exploration, and to commercial applications such as delivery, filmmaking, and search and rescue.

Furthermore, machine learning has recently become one of the most interesting research topics in Vietnam. It has changed the way we look at the technology of the future.

As deep learning continues to mature, we can expect to see applications of deep learning in new domains almost every day. This brings us to the topic of drones, specifically small drones. There are many important initiatives happening in navigation, 3D depth extraction, image recognition, object and human tracking, and surveillance.

We studied the problem of wilderness search and rescue, which entails performing a wide range of work in complex environments and over large regions. Given the concerns inherent in covering large regions with limited rescue resources, unmanned aerial vehicle (UAV)-based frameworks are a promising platform for providing aerial imaging and notifications for the emergency center.

Therefore, the team was delighted to research and learn about the topics above during our final term.

1.2 Objectives of the thesis

The thesis concentrated on several key goals:

- Researching the mechanical design, dynamics, aerodynamics, control circuits, signal processing, and noise processing.

- Researching and using suitable, compatible components such as sensors, microcontrollers, motors, power supply, and communication modules. In detail, this thesis mainly focuses on the Raspberry Pi 3 Model B and the Arduino Uno R3.

- Researching and applying a PID controller to the quadcopter to keep it well balanced.

- Combining these modules (GPS (Global Positioning System), sensors, Arduino, Raspberry Pi, etc.), including both hardware and software.

- Giving an overview of deep learning concepts and how they apply to our project. In particular, we combine MobileNet with the Single Shot Detector (SSD) framework as a fast, efficient deep learning-based method for object detection. The target of the project is to quickly explore an area for wilderness search and rescue missions using modern technology.

1.3 Related works

1.3.1 International research

1.3.1.1 UAV Market

The UAV market is estimated at USD 20.71 billion in 2018 and is projected to reach USD 52.30 billion by 2025, at a Compound Annual Growth Rate (CAGR) of 14.15% from 2018 to 2025. The rise in the procurement of military UAVs by defense forces worldwide is one of the most significant factors projected to drive the growth of the UAV market. The increasing use of UAVs in various commercial applications, such as monitoring, surveying and mapping, precision agriculture, aerial remote sensing, and product delivery, is also contributing to the growth of the UAV market. Based on region, the UAV market has been segmented into North America, Europe, Asia Pacific, the Middle East, Latin America, and Africa. North America is estimated to be the largest market for UAVs in 2018. The increasing use of UAVs for border and maritime surveillance activities in countries such as the US and Canada is driving the growth of the UAV market in North America [1].


Figure 1.1: UAV market by Region 2025

Today drones are being used in domains such as agriculture, construction, and public safety and security, to name a few, and are rapidly being adopted by others. With deep learning-based computer vision now powering these drones, industry experts are predicting unprecedented use in previously unimaginable applications.

1.3.1.2 Rescue of Quadcopter

Drones come in all shapes and sizes. Now researchers at the University of Zurich and École polytechnique fédérale de Lausanne (EPFL) have made a quadcopter that can change its shape and size in flight.

Figure 1.2: Traverse of a narrow gap to enter a collapsed building


The drone was built with first responders and rescue efforts in mind. Disaster sites rarely conform to logical shapes and sizes, so a drone that can change its shape and size on the fly to fit through tight spaces could prove extremely valuable.

As is often the case in experimental robotics projects, the researchers turned to animals for inspiration, specifically to how some birds can fold their wings to fly through narrow passages.

1.3.1.3 Hydrogen-Powered Drones: The Future of Transportation

The Skai, which was designed by BMW Group's DesignWorks studio, certainly looks futuristic, spare, and sleek. Alaka'i has initiated its test program to get certified with the FAA, and after that they see many possibilities for the Skai: passenger flight, emergency medical response, and cargo delivery.

Basically, it could do most of the things you could do with a helicopter, but presumably without as much noise and with only water as a direct emission (the indirect emissions will depend on the electricity source behind the hydrogen production). Plans call for a piloted version first, followed by autonomous models.

Figure 1.3: Hydrogen-Powered Drone model


1.3.1.4 Follow Me Drone Recognition Technology

Figure 1.4: SIMTOO Follow Me Tracking Drone 4K

Sensors and recognition technology, along with software algorithms, give UAVs the ability to recognize and follow a person or object. This deep learning-based following technology allows the UAV to track a moving subject without a separate GPS tracker.

Figure 1.4 shows the image of SIMTOO Follow Me Tracking drone

1.3.1.5 Mind-Controlled Drone

The user wears a headset, an electroencephalogram (EEG) device that picks up the brain's electrical impulses through sensors on the user's scalp. It records them on a computer and translates those thought patterns into flight instructions for a small drone.



Figure 1.5: Brain-controlled drone flown by students at the University of Florida

1.3.2 Domestic research

1.3.2.1 Domestic student research

Nowadays, students at many universities in Vietnam have applied advanced technology to quadcopters and made the field easier to approach.

At our own Ho Chi Minh City University of Technology and Education, the IS Laboratory in particular is a place where many students do research. We have seen several generations working with many quadcopter models, which have been selected for research in graduation theses as well as in scientific research projects. Some projects involve difficult technology and new knowledge that the world has only recently researched and applied, such as artificial intelligence, deep learning, and image processing. However, because the topic is new, it requires scientific knowledge in many fields, and with limited experience in UAV research the students' drones still have a number of limitations.

Figures 1.6 to 1.8 show quadcopters researched by Ho Chi Minh City University of Technology and Education students.


Figure 1.6: The drone is researched by HCMUTE students

Figure 1.7: The drone is researched by HCMUTE students

Figure 1.8: The graduate thesis drone of HCMUTE students

1.3.2.2 Military drone research

At higher levels of research with complex subjects, the studies of UAVs, in particular the quadcopter, have achieved very good results


Figure 1.9: MD4-1000 is used in Vietnamese Army

The complexity of these systems is enhanced with a variety of sensors. At the forefront of this area are still the Department of Defense research centers, with dozens of different types of unmanned aerial vehicles for military missions, scientific research, aerial observation, search and rescue, etc. Figure 1.9 shows the MD4-1000.

1.4 Limitation and propose a new algorithm

1.4.1 Propose a new algorithm

After surveying and studying object detection methods, instead of classical image processing with additional sensors, we chose a method that could be implemented and embedded on the quadcopter hardware.

We focused on the modern methods the world is currently studying, namely deep learning, to embed into the hardware.

Finally, we found a suitable neural network that meets the requirements of the topic: the MobileNet-SSD model running on the TensorFlow framework.
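As an illustration of how such a model can be run on the Raspberry Pi, the sketch below loads a MobileNet-SSD frozen TensorFlow graph through OpenCV's DNN module and runs detection on one camera frame. The file names, confidence threshold, and camera index are assumptions for illustration, not the exact files or settings used in this thesis.

```python
import cv2

# Assumed file names; replace with the exported frozen graph and its config.
MODEL_PB = "frozen_inference_graph.pb"
MODEL_PBTXT = "ssd_mobilenet.pbtxt"
CONF_THRESHOLD = 0.5  # assumed confidence threshold

# Load the MobileNet-SSD detector (TensorFlow export) with OpenCV's DNN module.
net = cv2.dnn.readNetFromTensorflow(MODEL_PB, MODEL_PBTXT)

cap = cv2.VideoCapture(0)          # webcam attached to the Raspberry Pi (assumed index)
ok, frame = cap.read()
if ok:
    h, w = frame.shape[:2]
    # SSD-MobileNet expects a fixed 300x300 input blob.
    blob = cv2.dnn.blobFromImage(frame, size=(300, 300), swapRB=True, crop=False)
    net.setInput(blob)
    detections = net.forward()     # shape: (1, 1, N, 7)

    for det in detections[0, 0]:
        score = float(det[2])
        if score > CONF_THRESHOLD:
            # det[3:7] are normalized box corners (x1, y1, x2, y2).
            x1, y1, x2, y2 = det[3] * w, det[4] * h, det[5] * w, det[6] * h
            print("class %d, score %.2f, box (%d, %d, %d, %d)"
                  % (int(det[1]), score, x1, y1, x2, y2))
cap.release()
```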


Figure 1.10: Propose an algorithm

1.4.2 Limitation

Due to time and knowledge constraints, the work has some limitations:

- The quadcopter model is a basic, self-built prototype

- The quadcopter's velocity must be kept relatively stable for the object detection task

- The system detects only the several patterns we trained on and does not generalize to varied, colorful objects

- Environmental conditions: outdoor flight in good weather, with no rain, no wind, and adequate lighting

1.5 Contents

There are six main contents in this graduation thesis:

Chapter 1 – Overview: This chapter presents an overview of the development status of UAVs and gives the reasons for selecting the topic, along with the requirements and objectives for the quadcopter model.

Chapter 2 – Theory Basis of Quadcopter for Searching Accidents: This chapter presents the basic theory of flight modeling as a moving mechanism, the PID controller, the sensor systems that can be used in the model, and the theory of image detection.

Chapter 3 – Design and Build the Hardware: This chapter presents the hardware of the model in detail, including the mechanical system and the electronic circuit boards. The mechanical part covers the shape and size and shows how the components are assembled on the model frame. The electronic part shows how the functional blocks are connected together.

Chapter 4 – Control Algorithms: The contents of this chapter include the mathematical modeling, the algorithms, and the algorithmic flowcharts for each model problem.

Chapter 5 – Experiments: This chapter was written after the model had been fully assembled, operated stably, and met the requirements of this topic.

Chapter 6 – Conclusions and Developments: This chapter evaluates the quality of the quadcopter model and its advantages and disadvantages in order to bring out directions for future development.


CHAPTER 2: THEORY BASIS OF QUADCOPTER

FOR SEARCHING ACCIDENTS

This chapter presents the basic theory of flight modeling as a moving mechanism, the PID controller, the sensor systems that can be used in the model, and the theory of image processing.

2.1 Flight Control Theory

2.1.1 Basic Motion

A quadcopter, an unmanned aerial vehicle (UAV), consists of four motors. Four propellers mounted on the four motors create the lift that allows the quadcopter to fly when the propellers spin. The front and back rotors turn counterclockwise, while the right and left rotors turn clockwise to balance the torque created by the four rotors on the frame [1].

Quadcopter control is a fundamentally difficult and interesting problem. With six degrees of freedom (three translational and three rotational) and only four independent inputs (rotor speeds), quadcopters are severely under-actuated. In order to achieve six degrees of freedom, rotational and translational motion are coupled. The resulting dynamics are highly nonlinear, especially after accounting for the complicated aerodynamic effects.

Figure 2.1: The basic motion of quadcopter


The general flight modes for a quadcopter or multi-copter are: move up (take off); move down (land); tilt right; tilt left; rotate right; rotate left; move forward; move backward; or fly in some other direction. A common way to mix these commands into the four motor speeds is sketched below.
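The following sketch shows one widely used mixing scheme for an X-configuration quadcopter. The motor numbering, sign pattern, and ESC pulse range are assumptions for illustration; the actual mixing in this thesis depends on how its motors are numbered and which propellers spin clockwise.

```python
def mix_motors(throttle, roll, pitch, yaw):
    """Map stick/PID commands to four ESC outputs (X configuration).

    Assumed convention: motor 1 front-right, 2 rear-left, 3 front-left,
    4 rear-right; motors 1 and 2 spin counter-clockwise, 3 and 4 clockwise.
    roll > 0 rolls right, pitch > 0 raises the nose, yaw > 0 turns clockwise.
    """
    m1 = throttle - roll + pitch + yaw   # front-right, spins CCW
    m2 = throttle + roll - pitch + yaw   # rear-left,   spins CCW
    m3 = throttle + roll + pitch - yaw   # front-left,  spins CW
    m4 = throttle - roll - pitch - yaw   # rear-right,  spins CW
    # Clamp to a typical ESC pulse range (1000-2000 us).
    return [max(1000, min(2000, int(m))) for m in (m1, m2, m3, m4)]

# Example: hover throttle with a small roll-right correction from the PID loop.
print(mix_motors(throttle=1500, roll=20, pitch=0, yaw=0))
```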

2.1.2 Six Degrees of Freedom (6DOF)

For this first part in developing the vehicle dynamics we’ll set the stage by describing the space in which we can express these dynamics [2]

For a rigid body in a 3-D world, we can describe the location of all points on the vehicle with six coordinates.

The first three should be obvious: these are the (𝑥, 𝑦, 𝑧) coordinates, which represent the distance of the object's center of mass from some origin in the 3-D world. The final three coordinates represent the orientation of the vehicle, given by three angles: (𝜙, 𝜃, 𝜓).

If we know all six coordinates, then we can determine the location of the cg from (𝑥, 𝑦, 𝑧) and, from the orientation (or vehicle attitude, as it is also called) (𝜙, 𝜃, 𝜓), we can determine the location of the rest of the vehicle in relation to the cg.

Figure 2.2: Rigid body and Flexible body


2.1.3 Representing Orientation

Figure 2.3: Inertial frame and Body frame

Before we go into Euler angles, we will first introduce the idea of multiple reference frames. For our model, there will be two reference frames (coordinate systems) we care about:

- Inertial reference frame: This frame is fixed to the Earth, and its coordinates could be based on cardinal directions (North, East) or on arbitrary directions within a room floorplan. The main point is that it does not move. We will define the inertial frame with the coordinates (𝑥, 𝑦, 𝑧).

- Body-fixed reference frame: We define a coordinate system with its origin at the cg, which gives a convenient way to describe how far away the propellers are and in what direction they face with respect to the cg. We will define the body-fixed reference frame with the coordinates (b1, b2, b3) and align them with the structure such that b1 and b2 are aligned with the symmetric axes of the body and b3 is in the vertical direction (parallel to the propeller motor axes), as shown in Figure 2.3.

The goal is to use the two coordinate systems to separate translational and rotational motion. We keep track of the orientation of the vehicle by rotating the body-fixed frame about the vehicle cg, and we keep track of the vehicle's location by translating the cg with respect to the inertial frame. The translation is simply represented by the vector distance of the body frame origin from the inertial frame origin.


2.1.4 Euler Angles

Figure 2.4: The idea of Euler angles

The idea of Euler angles is that one can represent any final orientation of a reference frame via a set of three sequential rotations about specific axes. Let's do an example rotation about different axes to get a feel for how this works [3].

The transformation matrix is referred to as a Direction Cosine Matrix or Rotation Matrix, which we will represent as 𝐑. These matrices are orthonormal (comprised of orthogonal unit vectors) and therefore have the property that 𝐑⁻¹ = 𝐑ᵀ.

For our modeling we will follow the standard commonly used in the aircraft industry, which uses a yaw-pitch-roll (3-2-1) rotation sequence. The first rotation, through the yaw angle ψ about the z-axis, is:

\[
\begin{bmatrix} x_1 \\ y_1 \\ z_1 \end{bmatrix} =
\begin{bmatrix}
\cos(\psi) & \sin(\psi) & 0 \\
-\sin(\psi) & \cos(\psi) & 0 \\
0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{2.2}
\]

To make things a little easier to read and fit on the page, I'm going to use a shorthand for the trigonometric functions: 𝑐(𝜙) = cos(𝜙) and 𝑠(𝜙) = sin(𝜙). Composing the yaw, pitch, and roll rotations then gives the full transformation from inertial to body coordinates:

\[
\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} =
\begin{bmatrix}
c(\theta)c(\psi) & c(\theta)s(\psi) & -s(\theta) \\
s(\phi)s(\theta)c(\psi) - c(\phi)s(\psi) & s(\phi)s(\theta)s(\psi) + c(\phi)c(\psi) & s(\phi)c(\theta) \\
c(\phi)s(\theta)c(\psi) + s(\phi)s(\psi) & c(\phi)s(\theta)s(\psi) - s(\phi)c(\psi) & c(\phi)c(\theta)
\end{bmatrix}
\begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{2.3}
\]


To simplify our notation, we will use a variable Cij to refer to these coordinate transformations, where the transformation is from a reference frame i to a different reference frame j Therefore, the above matrix would be expressed as a transformation from the inertial frame, n (as in navigation), to the body-frame, designated b:

\[
\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} = C_n^b \begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{2.4}
\]
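A minimal numeric sketch of this transformation, assuming the yaw-pitch-roll (ZYX) sequence written above, builds the rotation with NumPy and checks the orthonormality property R⁻¹ = Rᵀ:

```python
import numpy as np

def C_n_to_b(phi, theta, psi):
    """Direction cosine matrix from the navigation frame to the body frame,
    built as the ZYX sequence Rx(phi) @ Ry(theta) @ Rz(psi) (assumed order)."""
    c, s = np.cos, np.sin
    Rz = np.array([[ c(psi), s(psi), 0],
                   [-s(psi), c(psi), 0],
                   [      0,      0, 1]])          # yaw, as in Eq. (2.2)
    Ry = np.array([[c(theta), 0, -s(theta)],
                   [       0, 1,         0],
                   [s(theta), 0,  c(theta)]])      # pitch
    Rx = np.array([[1,       0,      0],
                   [0,  c(phi), s(phi)],
                   [0, -s(phi), c(phi)]])          # roll
    return Rx @ Ry @ Rz

R = C_n_to_b(np.radians(10), np.radians(-5), np.radians(30))
# Orthonormal: the inverse equals the transpose.
assert np.allclose(R @ R.T, np.eye(3))
# Rotate an inertial unit vector into body coordinates, as in Eq. (2.4).
print(R @ np.array([1.0, 0.0, 0.0]))
```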

2.1.5 Describe Motion

2.1.5.1 Simultaneous Translation and Rotation – The Chain Rule

Now that we have established that we want to represent inertial motion in the vehicle coordinate frame, we must understand what that means mathematically. In our derivation of the equations of motion we will want to take the derivative of vectors which are expressed in the body frame.

Using the rules of calculus, we must use the chain rule of derivation to account for both the change due to the time derivative of the vector within the coordinate frame, as well as the time derivative of the coordinate frame’s rotation

Let me show this with the time differentiation of the body-fixed velocity vector:

\[
\mathbf{v}_b = \begin{bmatrix} u & v & w \end{bmatrix}^T, \qquad
\left.\frac{d\mathbf{v}}{dt}\right|_{n} = \dot{\mathbf{v}}_b + \boldsymbol{\omega}_{nb} \times \mathbf{v}_b
\]


The cross-product 𝜔 × v can be rewritten as the skew-symmetric cross-product matrix, [𝜔 ×], premultiplying the velocity vector
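A short check of that identity, with assumed example values, confirms that the skew-symmetric matrix form reproduces the cross product:

```python
import numpy as np

def skew(w):
    """Skew-symmetric cross-product matrix [w x] such that skew(w) @ v == w x v."""
    return np.array([[    0, -w[2],  w[1]],
                     [ w[2],     0, -w[0]],
                     [-w[1],  w[0],     0]])

omega = np.array([0.1, -0.2, 0.3])   # example body rates (rad/s), assumed values
v_b   = np.array([2.0,  1.0, -0.5])  # example body-frame velocity (m/s), assumed

assert np.allclose(np.cross(omega, v_b), skew(omega) @ v_b)
```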

2.1.5.2 Newton’s Second Law of Motion

We will begin with Newton's 2nd Law, which is commonly summarized as force (𝐅) equals the change in momentum (p), or, for systems with constant mass, force equals mass (𝑚) times acceleration (𝑎), the derivative of velocity (v):

\[
\mathbf{F} = \frac{d\mathbf{p}}{dt} = m\frac{d\mathbf{v}}{dt}
\]

The rotational analogue is that moment (𝑀) equals the change in angular momentum (𝐻), or, for systems with constant mass distribution, moment equals the moment of inertia (𝐼) times angular acceleration, the derivative of angular velocity (𝜔):

\[
\mathbf{M} = \frac{d\mathbf{H}}{dt} = I\frac{d\boldsymbol{\omega}}{dt}
\]

2.1.5.3 Force, Mass, and Acceleration

We will be deriving the equations for translation and rotation separately, but viewing the equations side by side can help illustrate the similarities between the two

Translation:

\[
\mathbf{F} = m\frac{d\mathbf{v}}{dt}
\]


Inertial velocity in the body-fixed coordinate system will be expressed as:

\[
\mathbf{v}_b = \begin{bmatrix} u & v & w \end{bmatrix}^T \tag{2.11}
\]

Using the chain rule for a rotating body frame, the law of motion becomes:

\[
\mathbf{F} = m\,(\dot{\mathbf{v}}_b + \boldsymbol{\omega}_{nb} \times \mathbf{v}_b) \tag{2.12}
\]

Rotation: the analogous rotational law, \( \mathbf{M} = I\,\dot{\boldsymbol{\omega}}_{nb} + \boldsymbol{\omega}_{nb} \times (I\,\boldsymbol{\omega}_{nb}) \), is developed through the moment expressions in the following subsections.

2.1.5.4 Propeller Thrust

The thrust force of each of the four propellers is represented as 𝐹1, 𝐹2, 𝐹3 and 𝐹4, and all are aligned parallel with the vertical axis of the quadcopter, 𝑏̂3. The force vector within the body frame coordinate system can then be represented by:

\[
\begin{bmatrix} F_x \\ F_y \\ F_z \end{bmatrix} =
\begin{bmatrix} 0 \\ 0 \\ -F_1 - F_2 - F_3 - F_4 \end{bmatrix}
\]

Note that since our body frame convention has positive 𝑏̂3 in the down direction, thrust from the propellers is negative


Figure 2.5: The thrust force represented for each propeller: a) clockwise and counter-clockwise rotation; b) distances to the center of gravity

The moments are represented with components L, M, N for rotations about the 𝑏̂1, 𝑏̂2, 𝑏̂3 axes, respectively. As a moment is measured by force times distance, we need the motor distances from the center of gravity, as shown in Figure 2.5 [4]. For pitch and roll, half of the motors provide a positive moment and the other half provide a negative moment, depending on which side of the axis they are located.

\[
L = F_1 d_{1y} - F_2 d_{2y} - F_3 d_{3y} + F_4 d_{4y} \tag{2.17}
\]

\[
M = -F_1 d_{1x} + F_2 d_{2x} - F_3 d_{3x} + F_4 d_{4x} \tag{2.18}
\]

For yaw, the quadcopter takes advantage of its four rotating propellers. Half of the propellers are installed to rotate clockwise (#3 and #4) and the other half counter-clockwise (#1 and #2). Each propeller, as it pushes against the air, also applies a torque about the cg of the quadcopter in the direction of its rotation, as a function of its thrust, propeller radius, and distance from the cg. We will define this torque as a function 𝑇(𝐹, 𝑑𝑥, 𝑑𝑦) and can express the yaw moment of the vehicle as:

\[
N = -T(F_1, d_{1x}, d_{1y}) - T(F_2, d_{2x}, d_{2y}) + T(F_3, d_{3x}, d_{3y}) + T(F_4, d_{4x}, d_{4y}) \tag{2.19}
\]
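As a numeric sketch of these force and moment equations, the function below evaluates Eqs. (2.17)-(2.19) for a given set of thrusts and motor offsets. The motor offsets and the thrust-to-torque coefficient K_YAW are assumed example values, and the general torque function T(F, dx, dy) from Eq. (2.19) is simplified to a torque proportional to thrust, a common approximation rather than the thesis's exact model.

```python
import numpy as np

K_YAW = 0.02  # assumed thrust-to-yaw-torque coefficient (N*m per N of thrust)

def forces_and_moments(F, dx, dy):
    """Body-frame thrust force and moments from four propeller thrusts.

    F      : thrusts [F1, F2, F3, F4] in N
    dx, dy : motor distance magnitudes from the cg along b1 and b2, in m
    Signs follow Eqs. (2.17)-(2.19); props 1, 2 spin counter-clockwise, 3, 4 clockwise.
    """
    F1, F2, F3, F4 = F
    Fz = -(F1 + F2 + F3 + F4)                              # all thrust acts along -b3 (up)
    L = F1*dy[0] - F2*dy[1] - F3*dy[2] + F4*dy[3]          # roll,  Eq. (2.17)
    M = -F1*dx[0] + F2*dx[1] - F3*dx[2] + F4*dx[3]         # pitch, Eq. (2.18)
    N = K_YAW * (-F1 - F2 + F3 + F4)                       # yaw,   simplified Eq. (2.19)
    return np.array([0.0, 0.0, Fz]), np.array([L, M, N])

# Example: equal hover thrusts on an F450-sized frame (0.16 m offsets, assumed numbers).
force, moments = forces_and_moments([3.2, 3.2, 3.2, 3.2], [0.16] * 4, [0.16] * 4)
print(force, moments)   # moments are all zero at balanced hover
```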

2.1.5.5 Gravity Force in Body Coordinates

Gravity always acts towards the center of the Earth, and is expressed in the

