Final Year Project Report - BE Mechanical Engineering
Authors
ALI AMMAR NAQVI
10-NUST-BE-ME-19
KOMAL ZULFIQAR
10-NUST-BE-ME-101
10-NUST-BE-ME-50

Supervisor
Dr. YASAR AYAZ
HoD, Robotics & Artificial Intelligence

BS Mechanical Engineering
Declaration
We certify that this research work titled Autonomous Robotic Wheelchair is our own work. The
work has not been presented elsewhere for assessment. The material that has been used from other
sources has been properly acknowledged / referred.
Signature of Students
ALI AMMAR NAQVI
10-NUST-BE-ME-19
KOMAL ZULFIQAR
10-NUST-BE-ME-101
10-NUST-BE-ME-50
Signature of Supervisor
Dr.YASAR AYAZ
HoD, Robotics & Artificial Intelligence
Copyright Statement
Copyright in the text of this thesis rests with the student authors. Copies (by any process)
either in full, or of extracts, may be made only in accordance with instructions given by the
authors and lodged in the Library of NUST, SMME. Details may be obtained from the
Librarian. This page must form part of any such copies made. Further copies (by any
process) may not be made without the permission (in writing) of the authors.
The ownership of any intellectual property rights which may be described in this thesis is
vested in NUST, SMME, subject to any prior agreement to the contrary, and may not be
made available for use by third parties without the written permission of the School of
Mechanical & Manufacturing Engineering (SMME), which will prescribe the terms and
conditions of any such agreement.
Further information on the conditions under which disclosures and exploitation may take
place is available from the Library of NUST SMME, H-12, Islamabad.
Acknowledgements
We are thankful to our Creator Allah Subhana-Watala for having guided us throughout this project
at every step and for every new thought that He set in our minds to improve it. Indeed we could
have done nothing without His priceless help and guidance. Whosoever helped us through the
course of our thesis, whether our parents or any other individuals, did so by His will, so indeed
none is worthy of praise but Him.
We are profusely thankful to our beloved parents who raised us when we were not capable of
walking and continued to support us throughout in every department of our lives.
We would also like to express special thanks to our supervisor Dr. YASAR AYAZ for his help
throughout our thesis and also for Intro to Mechatronics & Robotics course which he has taught
us. We can safely say that we have not learned any other engineering subject in as much depth
as the one which he has taught.
We would also like to pay special thanks to Lab Engineer Khawaja Fahad Iqbal for his tremendous
support and cooperation. Each time we got stuck in something, he came up with the solution.
Without his help we wouldn't have been able to complete our thesis. We appreciate his patience
and guidance throughout the whole thesis.
We would also like to thank Dr. Omer Gillani and Dr. Mohsin Jamil for being on our thesis
guidance and evaluation committee, and express our special thanks to Lab Engineer Ahmed
Hussain Qureshi for his help. We are also thankful to Ali Bin Wahid, Usama Siraj and Muhammad
Affan for their support and cooperation.
Finally, we would like to express our gratitude to all the individuals who have rendered valuable
assistance to our study.
Abstract
The Autonomous Robotic Wheelchair is being designed to enhance the capabilities of the
joystick-controlled SAKURA Wheelchair made available through SAKURA's collaboration with
RISE Lab, SMME. While an electric wheelchair can be successfully navigated by most persons,
people with Parkinson's, Multiple Sclerosis or other severe impairments remain unable to
navigate a standard automated wheelchair in the same way. We have endeavored to design the
Autonomous Robotic Wheelchair so that these people can rely on our smart wheelchair to do the
navigation for them. Our Final Year Project is a small step towards this ambitious and important
goal being pursued by a variety of student teams working under the umbrella of the Mobile
Robotics Group at RISE Lab, SMME. Our project goals required that the wheelchair be able to
perceive the layout of its surroundings, localize itself within that layout, and, given particular
start and end points, steer itself from one to the other along a reasonable collision-free path. To
this end, we analyzed the hardware configurations of various other autonomous wheelchairs and
adopted the configurations which appealed to us the most. We modified the existing wheelchair
hardware to incorporate an LRF sensor and a table unit for an on-board laptop and electronics
housing. Following that, we developed our obstacle avoidance and navigation codes based on
C++ and Player\Stage. Our obstacle avoidance codes make use of real-time laser sensing by the
LRF, and our navigation codes are controlled either by the potential field navigation algorithm
or by the initial position and headings provided by the Player\Stage cfg files. Our obstacle
avoidance algorithms are monitored by Player\Stage running on the Ubuntu open-source
platform, which conveys LRF inputs to our C++ code; our C++ code, in turn, sends the forward
and turning velocities to our Windows-based LabVIEW program over a secure TP-Link wireless
network. Once our wheelchair becomes fully functional, probably after a series of coding
modifications, integration of further types of sensors, rigorous lab testing and optimization for
environments bigger than the RISE Lab test space, we will endeavor to prototype and market
this wheelchair as a healthcare product meant to provide mobility and enhanced independence
to the disabled and mentally challenged people we hope to assist.
Table of Contents
Declaration
Language Correctness Certificate
Copyright Statement
Acknowledgements
Abstract
Table of Contents
List of Figures
List of Tables
CHAPTER 1: INTRODUCTION
1.2 Hardware
1.2.3 NI-DAQ
1.3 Software
1.3.1 Ubuntu
1.3.2 Player/Stage
1.3.3 NetBeans
1.3.4 LabView
2.1.1 Motivation
2.2 Fabrication
4.4 Simulations
CHAPTER 6: NETWORKING
6.1 Introduction
6.2 Scope
6.4 Server/Client
6.4.1 Sockets
6.4.2 IP Addresses
6.5 Applications
7.2.2 90 Avoidance
Conclusion
APPENDIX A
REFERENCES
List of Figures
Fig 1.1: SAKURA Electric Wheelchair MC 2000
Fig 1.2: LRF Sensor
Fig 1.3: NI 6009 USB
Fig 1.4: Player\Stage User Interface
Fig 1.5: LabVIEW Graphical Interface
Fig 2.1: Wheelchair Mechanical Design Features
Fig 2.2: PRO\E Model for the Wheelchair
Fig 2.3: Final Hardware Representation of the Wheelchair
Fig 2.4: Wheelchair Fabrication Materials
Fig 2.5: Wheelchair Fabrication Processes
Fig 2.6: LRF Sensor Mount
List of Tables
CHAPTER 1: INTRODUCTION
1.1
The Autonomous Robotic Wheelchair is one of the flagship research projects of the Mobile
Robotics Group at RISE Lab, SMME. We are working on the SAKURA automated wheelchair
MC-2000 and incorporating sensors onto the wheelchair in order to enable intelligent motion.
For the purpose of our Final Year Project, we have worked with an LRF sensor and used it to
enable the wheelchair to perceive the obstacles in its environment and to navigate its path around
those obstacles. Other undergraduate and graduate groups working on the Autonomous Robotic
Wheelchair are working with a large variety of sensors, such as webcams and headsets.
For us, the motivation for working on the Autonomous Wheelchair project was the opportunity
to work on something that is at the cutting edge of robotics research and simultaneously a
comprehensive engineering challenge in itself. Beyond that, the autonomous wheelchair is a
deeply humanitarian project as well, for it seeks to empower and mobilize disabled people
suffering from conditions such as Parkinson's, Multiple Sclerosis and Tetraplegia.
The scope of our project is to enhance the capabilities of the SAKURA wheelchair in order to
give the wheelchair high maneuverability and navigational intelligence. It is desired that the
wheelchair should be able to perceive the layout of its surroundings, localize itself within that
layout and, given particular start and end points, steer itself along a reasonable collision-free
path.
1.2
Hardware
1.2.3
NI-DAQ
1.3
Software
1.3.1 Ubuntu
1.3.2 Player\Stage
The Player Project creates open-source software under the GNU Public License that enables
advanced research in robotics. It is the most widely used robot control interface in the world.
The Player robot device interface provides a network server for robot control. Player has a
client-server architecture which allows robot control programs to be written in any programming
language and to run on any computer that has a network connection to the robot. Player clients
can talk to Player over a TCP socket, reading data from sensors, writing commands to actuators,
and configuring devices on the fly.
Stage is Player's back-end simulator. It simulates multiple robots moving around concurrently
and sensing a two-dimensional bitmap environment. The sensor models provided include sonar,
LRF, webcam and odometry. Player's modular architecture makes it easy to add support for new
hardware, and commonly the control programs written for Stage's simulated robots work just as
well on real hardware.
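The Player server learns which drivers to load from a configuration (.cfg) file. A minimal sketch of such a file, in the style used with the Stage plugin, is shown below; the world file name and model name are illustrative placeholders, not our actual project files.

```
# Load the Stage simulator as a Player plugin (file names are illustrative)
driver
(
  name "stage"
  plugin "stageplugin"
  provides ["simulation:0"]
  worldfile "rise_lab.world"   # hypothetical world file for the test space
)

# Expose one simulated robot's drive and laser to Player clients
driver
(
  name "stage"
  provides ["position2d:0" "laser:0"]
  model "wheelchair"           # hypothetical model name from the world file
)
```

A client program then connects to the server (by default on port 6665) and reads the laser and odometry proxies exactly as it would on the real robot.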
1.3.3 NetBeans
1.3.4 LabVIEW
LabVIEW interfaces directly with National Instruments data acquisition hardware such as the
NI 6009 USB which we have used, and it has inbuilt features for connecting our programs to the
web using the LabVIEW web server and software standards such as TCP/IP networking.
For our project, we have used LabVIEW in order to receive our velocity values across the
TP-Link wireless network. We feed in our laptop's IP address to enable a Windows-based
computer and an Ubuntu-based computer to network. We use NetBeans and TCP broadcasting to
repeatedly update the forward and angular velocity values for the wheelchair. LabVIEW
translates these into power values for the wheelchair's differential drive, and these variable
power signals are then sent to the wheelchair control using the NI DAQ USB. The following is a
snapshot of LabVIEW's interface.
2.1 Design
We were required to make a design which was totally detachable and could accommodate LRF
(Laser Range Finder) on it. Furthermore we were also required to design a laptop table in front of
the wheelchair user which could simultaneously serve as a mounting for sonar sensors or
webcams as well. This table was also meant to have a shelf where different electronic
circuitry could be placed.
So we designed the following new features in our wheelchair:
i. Light weight
ii. Mechanically stable
iii. Detachable
2.1.1 Motivation
We studied various designs of the autonomous wheelchairs that are being developed around the
world in different universities like:
2.2 Fabrication
For fabrication purposes, we used the Manufacturing workspace. We fabricated the following
components, namely:
i. LRF Mount
ii. Laptop Table

The materials used were:
Stainless Steel
Aluminium
Cast Iron
Wood

The fabrication processes used were:
Welding
Riveting
Drilling
Threading
Cutting

The LRF Sensor Mount was made from:
(i) Aluminium Sheet
(ii) Cast Iron hollow pipe
(iii) Bush
In order to make the wheelchair autonomous, we shifted its control from the joystick to
LabVIEW. For this purpose the wires were detached from the joystick and inserted into the
NI-DAQ. The NI-DAQ is a real-time USB device developed by National Instruments which
connects hardware with software; in this case the hardware is our wheelchair and the software is
LabVIEW.
(Diagram: Wheelchair, NI-DAQ and LabVIEW connection)
Description:
This LabVIEW program was made to control the wheelchair motion, which was previously
controlled by the joystick. A continuous while loop is implemented so that the required value is
sent to the wheelchair at all times, and there is a stop button in the program by which the code
can be terminated at any time. Angular and forward velocity values are given to the program,
and these values are then converted to specific voltage values. This conversion is done by two
functions, one made for forward speed and another for angular speed. Each function was made
by applying the straight-line equation, y = mx + c, to the velocity-voltage graphs (the graph is
explained in the next section). The voltage values for forward and angular speed are then sent to
the wheelchair via the NI-DAQ. Constant voltages of 1.85 volts are provided to the wheelchair
when the wheelchair is stationary.
The voltages were provided by the LabVIEW software via the NI-DAQ. The RPM of the rear
wheels was measured with a tachometer, and the velocity was then calculated on the basis of the
rpm.
Fig 17: Voltage-Velocity Graph (velocity in m/s plotted against applied voltage in V)
The graph above shows the trend of voltage vs. velocity. We gave the wheelchair different sets
of voltages through LabVIEW and then measured the resultant speed of the wheelchair by
measuring the rpm of its rear wheels.
The following conclusions can be drawn from the above graph:
There is a voltage window, from 1.25 volts to 2.3 volts, where the wheelchair does not move
due to inertia.
The main purpose of drawing this graph was to know exactly what speed output to expect when
a specified voltage is provided. This knowledge helped during the development of the obstacle
avoidance and navigation codes, and it also helped us during the real-time wheelchair testing
phase.
Literature Review
4.2
Pseudo-Code
From Jennifer Owen's How to Use Player\Stage manual we were able to obtain a basic obstacle
avoidance function which we were able to expand upon in different ways. Our obstacle
avoidance functions attempt to update motor speeds if there is an obstacle to be avoided. We
update the function just before sending data to the motors; thus, obstacle avoidance overwrites
our navigation behavior for the robot.
Once the obstacle is no longer in its way, the robot moves along as it
was. This allows us to transition from any navigation behavior into
obstacle avoidance and then back again, as per the requirements of
our control structure.
We use lp.GetMinLeft() and lp.GetMinRight() in order to look to the left and right of the robot
for obstacles. The readings obtained are processed, and we check whether the left or the right
distance to the obstacle is greater than 100. Based on that, we calculate the robot's newspeed and
newturnrate, which are limited to values of -40 to +40, and use the SetSpeed command to set
these motor speeds.
initial speed for which the wheelchair would move. For power values
below this velocity level, the wheelchair does not move. Thereafter,
inside a continuous for loop we use Robot.Read() command to update
LRF values obtained from the LRF attached to the wheelchair.
Details of how we can set up Player\Stage in order to read in these
LRF values will be covered in the next section. As before, we read in
LRF readings of minimum distances at angles 88 to 91 using the
lp.GetRange() command. And we check to see if at some point all of
these GetRange() values would be less than 1.3 or some other specified
distance directly in front of our robot? If the obstacle is present, the
robot halts and reads its X and Y coordinates using the
robot.GetXPos() and robot.GetYPos() commands. Thereafter, it uses
those x and y positions to navigate around the obstacle using a
sequence of SetSpeed commands that take the robot out of its path,
make it cross the obstacle and makes it come back to its path after
having crossed the obstacle. Lastly, it checks to see if the endpoint has
been reached at which point the robot stops.
Pseudocode for SetSpeeds-based Obstacle Avoidance:
Enable motors
Initialize ForwardSpeed and TurningSpeed to zero
Set ForwardSpeed to 0.8
Loop:
  Use Robot.Read()
  Use GetRange() to look in front of the robot
  If an obstacle is closer than the set distance (1.3):
    Read robot coordinates at current position
    Use SetSpeed commands with current position coordinates to move around the obstacle
  Else continue moving along the original path
4.3
We used the following terminal commands to identify the LRF's USB-serial device and give
Player read access to it:
ls -l tty*
sudo chmod +r ttyUSB0
ls -l ttyUSB0
ls -l ttyU*
4.4
Simulations
CHAPTER 5: NAVIGATION
5.1
Literature Review
5.2
For Potential Field navigation we start out by creating a two-dimensional discretized
configuration space of X and Y coordinates. We can then attach a potential value to each of these
grid configurations, denoting a potential function, or gradient, spread over the entire 2D field
within which the robot moves.
The potential field method treats the robot as a point under the influence of an artificial potential
field U(q). The robot moves by following the field, just as a ball rolls downhill under the action
of gravity. The basic idea is that the endpoint exerts an attractive potential on the robot, while the
obstacles within the robot's environment exert repulsive potentials. The potential field of the
robot is then computed as the sum of the attractive field of the goal and the repulsive fields of the
obstacles.
U(q) = Uattr(q) + Urep(q)
At every point in time, the robot looks at the potential field vector at that point and starts
moving in that direction. The attractive and repulsive potentials can be found using any of the
linear or quadratic formulae, with various scaling factors for the endpoint attraction and the
repulsion by obstacles. One of the problems associated with the potential field algorithm is the
Local Minima Problem, in which the robot is unable to navigate its way to the endpoint because
the repulsive potentials of the obstacles choke its pathway.
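One common quadratic choice for these potentials (a standard textbook formulation, not necessarily the exact one used in our code) is shown below, where ρ_goal(q) is the distance from configuration q to the goal, ρ(q) is the distance to the nearest obstacle, ρ₀ is the obstacles' radius of influence, and k_attr, k_rep are scaling factors:

```latex
U_{attr}(q) = \tfrac{1}{2}\, k_{attr}\, \rho_{goal}(q)^{2}
\qquad
U_{rep}(q) =
\begin{cases}
\tfrac{1}{2}\, k_{rep}\left(\dfrac{1}{\rho(q)} - \dfrac{1}{\rho_{0}}\right)^{2} & \rho(q) \le \rho_{0}\\[4pt]
0 & \rho(q) > \rho_{0}
\end{cases}
```

The robot then descends the combined field, moving along the force F(q) = -∇U(q) = -∇U_attr(q) - ∇U_rep(q), which is the "potential field vector" referred to above.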
5.3
BrushFire PseudoCode
In order to implement the Potential Field navigation method, we implemented the BrushFire
navigation scheme. The BrushFire algorithm is a clearance-based path-finding algorithm which
overlays a grid onto the entire navigational space of the robot.
In the BrushFire algorithm we've represented the 2D navigational space of our robot as a grid of
pixels. For example, if we've used a 400 x 400 jpg image for our robot environment, we will
first break it down into a grid of blocks. Each block is then checked using nested for loops: if the
pixels are occupied by obstacles we mark them with 1s, and if they're free and the robot can
travel to them we mark them with 0s.
The attractive potential of the endpoint is computed as a function of the distance of each pixel
from the goal. The repulsive potential of the obstacles, on the other hand, is generated by
traversing the map such that at each pixel visited, the 8 neighbors connected to it are updated to
a value one greater than the value of the current pixel. Doing so, we obtain a grid where the
occupied pixels are blocked out, the empty pixels close to the obstacles have higher grid values
than those adjacent to them, and the pixels towards the endpoint have a decreasing grid value.
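The repulsion pass described above can be sketched as a breadth-first sweep outward from the obstacle cells: each free 8-connected neighbour of a labelled cell receives a value one greater than that cell. This is a minimal illustration of the propagation, with grid contents supplied by the caller rather than read from an image.

```cpp
#include <queue>
#include <utility>
#include <vector>

// BrushFire repulsion pass: obstacle cells hold 1, free cells hold 0.
// Each free cell ends up labelled one greater than its nearest
// already-labelled 8-connected neighbour, producing the rippling grid.
std::vector<std::vector<int>> brushfire(std::vector<std::vector<int>> g) {
    int H = (int)g.size(), W = (int)g[0].size();
    std::queue<std::pair<int,int>> q;
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            if (g[y][x] == 1) q.push({y, x});     // seed with obstacle cells
    while (!q.empty()) {
        auto [y, x] = q.front(); q.pop();
        for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx) {    // 8-connected neighbours
                int ny = y + dy, nx = x + dx;
                if (ny < 0 || ny >= H || nx < 0 || nx >= W) continue;
                if (g[ny][nx] == 0) {             // unvisited free cell
                    g[ny][nx] = g[y][x] + 1;
                    q.push({ny, nx});
                }
            }
    }
    return g;
}
```

For a one-row grid {1, 0, 0} the sweep produces {1, 2, 3}: values grow with clearance from the obstacle, which is exactly the repulsion ripple the text describes.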
The architecture of our BrushFire Navigation program was made in the following way:
(i) CreateMap:
the CreateMap program takes a jpg or png image as its input and overlays a grid onto the image,
marking the occupied grid blocks as (1) and the unoccupied free-space grid blocks as (0).
(ii) BrushFireMap:
the BrushFireMap program generates the BrushFire grid. It uses nested for loops to access every
grid block; for each occupied grid block marked (1) by the CreateMap grid, it updates the cells
around that obstacle to show repulsion. It then moves on to the cells around those updated
blocks, and keeps updating values on the grid to show the rippling effect of the repulsion
produced by the obstacles in the navigation space.
(iii) DistanceMap:
the DistanceMap program is used to generate grid values denoting the attractive potential of the
endpoint for the robot. Once again we make use of nested for loops to iterate over all the grid
blocks; blocks closer to the endpoint are marked with lower and lower values, with the endpoint
having the value of zero and the starting point having the maximum value on the grid.
(iv) PotentialMap:
the PotentialMap program relies on the outputs of the BrushFireMap, denoting the repulsive
potential, and the DistanceMap, denoting the attractive potential. It uses the grid output from
both to compute a grid showing the net potential function at each point. The potential function is
computed as follows:
Potential Function = P * DistanceMap[x][y] + N * pow(1/BrushFireMap[x][y], 3)
The values of P and N are varied by looking at the results of different simulation trials. Usually
P is around 20 and N is on the order of 10^5.
(v) RunBrushFireMap
5.4
BrushFire Simulations
We have tested the BrushFire in Player\Stage simulations only. We were able to obtain
successful test trials in some of the BrushFire simulations but not all. Some of the problems
arose from the programming complexity of the algorithm and from the local minima problems of
the BrushFire algorithm.
The following are screenshots of our BrushFire simulations, which enable the robot to move
from a start point (provided by the cfg file) to an endpoint (specified in the code). We started out
by using simpler jpg images for testing, as shown.
Eventually we progressed to more complicated scale-model representations of our RISE Lab
workspace with more realistic model-type obstacles placed within that workspace. In almost all
of these simulations the robot was able to determine the shortest-cost path to its specified
endpoint. However, perhaps because the obstacles were made as models within the cfg file, or
for some other reason, the BrushFire algorithm almost always crashed during the more
challenging test runs.
5.5
Run-Time Navigation
We were not able to use our BrushFire programs for real run-time navigation. There were not
only logical problems in the implementation of the algorithm, but the test constraints also
became a significant hurdle. For example, we need to provide forward and angular velocities in
order to physically drive our wheelchair using the NI 6009 USB. However, the BrushFire code
does not navigate based on velocity variables; it navigates based on the relative grid values from
start point to end point. Logically, we could not translate the output of the latter into velocity
values for the power drive of our wheelchair. Another significant problem was the small test
space and the limited scope of obstacle avoidance and navigation we could afford within that
test space, as well as the front two wheels of the wheelchair being idle and causing significant
deviations between the expected and actual wheelchair motion.
CHAPTER 6: NETWORKING
6.1
Introduction
Networking between computers allows communication between different users and programs,
thus enabling the sharing of information between computers located at different places. The
information shared on a network is accessible to all the computers connected to that network.
There are two main types of Computer networks,
(I)
Wireless Network
(II)
Wired Network
We used Wireless Networking for our project as it enabled us to communicate with the
wheelchair from a distance without any wired connection, therefore enabling free motion of the
Wheelchair without any hindrance.
6.1.1 Wireless Networking
Wireless networking is a flexible data communications system, which uses wireless media
such as radio frequency technology to transmit and receive data over the air, minimizing the
need for wired connections. Such networks are used to replace wired networks and are most
commonly used to provide the last few stages of connectivity between a mobile user and a wired
network.
Wireless networks use electromagnetic waves to communicate information from one point to
another without relying on any physical connection. Radio waves are often referred to as radio
carriers because they simply perform the function of delivering energy to a remote receiver.
The data being transmitted is superimposed on the radio carrier so that it can be accurately
extracted at the receiving end. Once data is superimposed (modulated) onto the radio carrier,
the radio signal occupies more than a single frequency, since the frequency or bit rate of the
modulating information adds to the carrier. Multiple radio carriers can exist in the same space
at the same time without interfering with each other if the radio waves are transmitted on
different radio frequencies. To extract data, a radio receiver tunes in one radio frequency while
rejecting all other frequencies. The modulated signal thus received is then demodulated and
the data is extracted from the signal.
6.1.2 Advantages of Wireless Networking
Wireless networks offer the following advantages over traditional wired Networks:
Mobility: They provide mobile users with access to real-time information so that they can
roam around in the network without getting disconnected from the network. This mobility
supports productivity and service opportunities not possible with wired networks.
Installation speed and simplicity: Installing a wireless system can be fast and easy and can
eliminate the need to pull cable through walls and ceilings.
Reach of the network: The network can be extended to places which cannot be wired.
More Flexibility: Wireless networks offer more flexibility and adapt easily to changes in the
configuration of the network.
Reduced cost of ownership: While the initial investment required for wireless network
hardware can be higher than the cost of wired network hardware, overall installation
expenses and life-cycle costs can be significantly lower in dynamic environments.
Scalability: Wireless systems can be configured in a variety of topologies to meet the needs
of specific applications and installations. Configurations can be easily changed and range
from peer-to-peer networks suitable for a small number of users to large infrastructure
networks that enable roaming over a broad area.
6.2
Scope
In this project we used two computers operating in parallel. The first computer had Ubuntu as an
Operating System with Player/Stage and NetBeans. The working algorithm gave the velocity
output stored in variable form that was to be transmitted. We used wireless communication to
transmit these velocity values from the navigation algorithm to the LabView operated computer.
This LabVIEW code further translated these values into voltage outputs, thereby enabling the
accurate movement of the wheelchair.
6.3 Literature Review
The networking task required a lot of literature reading, as the coding for creating a network and
transferring data was fairly new for us. This involved the study of standard networking tutorials
and manuals. The following are some of the tutorials that were used for networking.
(I)
(II)
6.4
Server/Client
Network communication is based on a server/client system. We used the server to transmit
float type data to the client by casting it into binary form. The client interpreted this binary data
and then re-calculated the values as float type data.
6.4.1 Sockets
The first step in creating a server is to create a parent socket. There are two types of sockets:
Stream sockets and Datagram sockets. We used Stream sockets for our server because Stream
sockets are reliable two-way connected communication streams: if we output two items into the
socket in the order 1, 2, they will arrive in the order 1, 2 at the opposite end, and they will be
error free.
6.4.2 IP addresses
The IP address of the parent socket connection to the server was known. It was in the form of
IPv4. This IP address was used by the client program to connect with the server on the network
and pick up the data.
(I) Little-Endian: When representing a two-byte value such as b34f, this byte order stores the sequential bytes 4f followed by b3 in memory. This storage method is called Little-Endian.
(II) Big-Endian: To represent the same two-byte value b34f, we need to store it in two sequential bytes, b3 followed by 4f. This number, stored with the big end first and the small end last, is in Big-Endian order.
Thus, because Intel microprocessors use Little-Endian, the computer may store bytes in the reverse of network order. To counter this problem we cast each element into binary form and use the network-byte-order conversion functions, which ensure that the byte order is interpreted identically on both computers. We make use of long-type variables, which provide four bytes of storage.
6.5 Application
Setting up a Server
(I) Create the parent socket with the socket() call.
(II) Bind the socket to a port on the server machine with bind().
(III) Mark the socket as passive with listen() so that it can accept incoming connections.
(IV) Accept a client connection with accept() and transmit the data with send().
7.1 Testing Limitations
7.2 Single Obstacle Avoidance
obstacle. After detecting the obstacle, the wheelchair moves out of its path horizontally, crosses the obstacle at a specified distance from it, and returns to its original path line horizontally after making yet another 90° turn. Problems arise during 90° obstacle avoidance when the front wheels behave differently and trace a different angle (not equivalent to 90°) from one trial run to the next. A possible solution is to keep the testing conditions as similar as possible so that test results can be repeated based on our time calibrations. Video 1 shows Single Obstacle Avoidance with 90° turns.
Fig 25: 90° and Angular Obstacle Avoidance
7.3 Multiple Obstacle Avoidance
Our Multiple Obstacle Avoidance code applies the Single Obstacle Avoidance 90° turns consecutively. As before, the wheelchair is prepped for motion by giving it zero velocity values. After a while, it is given a forward velocity of 0.8 m/s to get it started. When the first obstacle is detected, the wheelchair halts, makes a 90° turn, and moves horizontally to avoid the first obstacle. It then continues on its new path until it encounters a second obstacle in front of it, at which point it again makes a 90° turn and avoids the obstacle by moving horizontally.
A slight variation we could have programmed would have been multiple obstacle avoidance in which the wheelchair returned to the original path along which it started. This is easily possible with slight modifications to our current code.
8.1 Conclusions
The Autonomous Robotic Wheelchair is an important technical solution to the problems posed by traditional powered wheelchairs. It enables end-users to operate the wheelchair by themselves, without assistance from another person, irrespective of their disability or physical condition. For people with Parkinson's disease, Multiple Sclerosis, or Tetraplegia, an autonomous wheelchair can be a means to an independent, fulfilling life.
In the course of our Final Year Project entitled Autonomous Robotic Wheelchair we have been
able to make decent progress towards our dream of developing such a wheelchair here in
Pakistan. We could not have gotten anywhere had it not been for the SAKURA automated
wheelchair provided to us at RISE Lab. The automated wheelchair had the electronic control and
the hardware which was a necessary prerequisite for our project.
We were able to modify the existing hardware of the wheelchair to mount our LRF and the laptop table with electronics housing. Our hardware design is modular, simple, and completely detachable. In the course of future work on the wheelchair hardware, subsequent sensor mounts can be designed either on top of our own mounts or in place of the holes and fasteners that we have already used.
As for the project tasks of Obstacle Avoidance and Navigation, we made respectable progress on both fronts. Not only could we successfully implement Obstacle Avoidance and Navigation in our simulations, we were also able to implement both during our trial runs in the lab, although only on a more restricted scale, approximating the behavior of our original algorithm commands by using sleep calls with different time values.
Finally, we were able to successfully upgrade the wheelchair from joystick control to laptop control while making use of sensor readings, all in real time. A robust LabVIEW program was used to drive the wheelchair by giving it power/velocity values, and Player/Stage was used to collect data from the LRF for successful obstacle avoidance in almost every single trial run.
We took the first steps under the guidance of Dr. Yasar Ayaz and Sir Khawaja Fahad Iqbal, and eventually the road ahead became clear step by step. We started by getting acquainted with Ubuntu and the Player/Stage interface. We studied the Player/Stage manual and built and tested several smaller Player/Stage programs. Eventually we started integrating our C++ coding into our Player/Stage simulations and programming for more complicated problem scenarios.
We learned coding in LabVIEW with Ali Bin Wahid, who had already developed a small LabVIEW program to convert the wheelchair's joystick control to laptop control. We built upon his original program to incorporate the wireless settings and TCP broadcasting so that LabVIEW could communicate in real time with our Player/Stage code.
Another challenging task was setting up the Player/Stage codes to work on the hardware rather than in simulations. The LRF had to be upgraded from the simulation proxy to the USB Port 0 proxy, and next we had to network the computers. The crux of our networking was based on the wireless TP-Link and on our C++ code, which issued sendData commands each time a SetSpeed call was made.
Another learning outcome of the project was getting to understand the actual behavior of the wheelchair, as it differed from the faultless simulations we could create in Player/Stage. Several test variables modified the behavior of the wheelchair in real time, e.g. the surface friction of the test space, the uncontrollable variations in the front wheels' motion, and the amount of battery charge remaining.
8.2 Future Recommendations
Future work on the Autonomous Robotic Wheelchair could involve improvements to the current Obstacle Avoidance and Navigation algorithms. We have only made use of the most basic algorithms at this point, which are good enough for room-level movement, but to make the wheelchair more capable we need to move towards building-level navigation. For that, more sophisticated navigation algorithms such as Brushfire, DFS, or BFS need to be implemented for wheelchair motion.
The wheelchair hardware also needs slight improvements. In its current state, the wheelchair table is not positioned correctly to fully accommodate the wheelchair user's legs. The table platform needs to be raised, and mounts need to be installed for other sensors.
Lastly, there is also a lot of scope for improvement in the design of our current LabVIEW program for driving the wheelchair. As it is, our LabVIEW program is fairly simple and merely translates the velocity values received from the code. We need to develop a LabVIEW program that can interact with multiple Player/Stage C++ codes, reading and processing data from multiple sensors. Once we have developed techniques to incorporate such processing within LabVIEW itself, the wheelchair will become vastly more intelligent than it is right now.
APPENDIX A
REFERENCES
http://playerstage.sourceforge.net/
http://playerstage.sourceforge.net/player/player.html
http://www.ni.com/pdf/manuals/371303m.pdf
http://beej.us/guide/bgnet/
http://beej.us/guide/bgnet/output/print/bgnet_USLetter.pdf
http://www.lantronix.com/resources/networking.html