Sunday, 25 May 2014

ROBOTICS


Robots are smart machines that can be programmed and used in many areas such as industry, manufacturing, production lines, and healthcare. They perform hard, dangerous, and precision work to facilitate our lives and to increase production: they can work without rest, and they can do human-like tasks faster and more precisely. Assistive mobile robots that take over different kinds of everyday activities in these areas are now very commonly used to improve our lives.
Based on the type of control mechanism, robots are categorized into two types:
Autonomous Robots
            These robots have the capacity to think for themselves and take decisions on behalf of humans, thanks to deep developments in the field of artificial intelligence (AI). Though better in capability, these robots have not become popular in the market for several reasons:
a.       Autonomous robots have decision-making capabilities, but in many places, such as nuclear power plants, decisions must be taken by the expert personnel handling the plant and not by a robot; otherwise a disaster may occur.
b.      When a robot is used as a spy, it must be handled by a military authority, since some situations require harsh decisions initially in order to gain a benefit later.
c.       The cost of making autonomous robots is very high.
Where autonomous robots fail, we require manually controlled robots, i.e. non-autonomous robots.
Non-Autonomous Robots
These robots have the programmed logic to do the desired task, but the decision-making power lies in the hands of the (human) controller handling the robot. The interface between the controller and the robot can be made using two methods:

a.       Wired: the connection between the controller and the robot is maintained using wired interfaces, which can be serial or parallel; in both techniques the underlying technology is the transmission of electrical signals.
b.      Wireless: the connection between the controller and the robot is maintained using wireless interfaces such as

                        Wi-Fi
                        Bluetooth
                        WiMAX
                        ZigBee, etc.

Here the signals are transmitted wirelessly through the air.

Wi-Fi is a wireless networking technology based on a series of specifications from the Institute of Electrical and Electronics Engineers (IEEE) called 802.11. Wi-Fi uses unlicensed radio frequencies, mostly in the 2.4 GHz band. It enables a person with a wireless-enabled computer to connect to the Internet via a wireless access point. The geographical region covered by one or several access points is called a hot spot. Wi-Fi was intended to be used for mobile devices and local area networks, but it is now often used for Internet access outdoors. There are several types of Wi-Fi:
            802.11a (transmission speeds of 24 Mbps to 54 Mbps)
            802.11b (6 Mbps to 11 Mbps)
            802.11g (24 Mbps to 54 Mbps)
            802.11n (50 Mbps to 100 Mbps)
This wireless technology can be combined with robots to develop a Wi-Fi controlled robot. The major benefit of using a wireless technique to control the robot is that the controller can receive live information about the situation. The Wi-Fi controlled robot uses the Wi-Fi 802.11g standard and carries its control signals over the TCP/IP protocol, which provides flow control. This enables uninterrupted and reliable transmission of control signals to the robot vehicle. Wi-Fi supports high data rates, which enables good-quality, uninterrupted video transmission from the robot to the display device.
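As a sketch of how such control signals might be encoded, the snippet below maps direction commands from a control page to left/right motor states. The command names and the motor-state convention are illustrative assumptions, not taken from the project itself.

```python
# Hypothetical mapping from webpage commands to (left, right) motor directions.
# 1 = forward, -1 = reverse, 0 = stop; names and values are illustrative only.
def command_to_motors(cmd):
    table = {
        "forward": (1, 1),
        "reverse": (-1, -1),
        "left":    (-1, 1),   # spin left: left wheel back, right wheel forward
        "right":   (1, -1),
        "stop":    (0, 0),
    }
    return table.get(cmd, (0, 0))  # unknown commands stop the robot
```

Over a TCP connection the controller would simply send the command string; the robot reads it, looks up the motor states, and drives its motor driver accordingly, with TCP's flow control guaranteeing in-order delivery.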


Android devices are powerful mobile computers with permanent Internet connectivity and a rich variety of built-in sensors, GPS, compasses and cameras, Bluetooth, and high-end processors running at an average of 500 MHz. Android uses the Java programming language. Getting started with Android is easy; the API is open. In addition, the Android API allows easy access to the hardware components. Using this feature of an Android phone, robots can be controlled over Wi-Fi. A Wi-Fi enabled robot is a system that controls the robot using a web page. To make this feasible we make use of the BeagleBone board and a Wi-Fi module.

Beagle Bone based Android Controlled Robot


OVERVIEW
1.1 Aim of the project
Primary objective: The aim of the project is to design and develop a robot whose movements can be controlled using an application on an Android smartphone.
Secondary objective: Develop a method to capture video of the surrounding environment and to display it on the desired display module over the Wi-Fi network.

1.2 Introduction
Spy robotics is the design and manufacture of intelligent machines that are programmed to perform specific tasks. Robots are generally designed to be a helping hand; they help us in difficult, unsafe, or boring tasks. Simply put, robots are machines that can be programmed to perform a variety of jobs, and they can range from simple machines to highly complex, computer-controlled systems. Robotics is one of the most exciting areas of electronics: the field of controlling electronic machines that can substitute for human actions. This field has become so advanced that in the near future robots may imitate human behavior.
Bridging the gap between the world of the computer and that of its user has always been one of the chief goals of robotics. Graphical interfaces, input devices, speech generators, handwriting recognition systems, and face recognition systems are just a few examples of how computers have become more accessible. Motion detection is a more recent thrust in this direction and represents a major step in bringing the computer into our world.
This project focuses entirely on two specific challenges: video streaming and movement of the robot. Computers have become fast enough to perform computationally intensive image processing tasks, and storage devices have grown to allow the accumulation of large databases of images.
1.3 Problem statement and formulation
In a wired controlled robot, the connection between the controller and the robot is maintained using wired interfaces. These interfaces can be serial or parallel; in both techniques the underlying technology is the transmission of electrical signals, which are sent in the form of specific patterns. These patterns are analyzed by a processor mounted on the robot in order to carry out the specific task.
       In a wireless controlled robot, the connection between the controller and the robot is maintained using wireless interfaces such as Wi-Fi. The underlying technology is the transmission of signals wirelessly through the air by the transmitter; the signals are captured by the receiver and sent to the processor mounted on the robot to carry out the decisions.
   By using an Android smartphone, the robot is controlled through the Wi-Fi network, and by installing a camera on the same robot the user can control it more precisely and accurately.

1.4 Methodology
The project begins when the designed web page is accessed using an Android phone. The robot is controlled using the directional icons on the web page. Live video of the surrounding environment is streamed to the desired display module over the Wi-Fi network.

The control of the robot includes four distinct phases:

·         Perception
·         Processing
·         Intimation
·         Action

The perceptors are the cameras mounted on the robot. The processing is done by the BeagleBone board. The intimation is given via the Android phone to the robot. The action is performed by the motors.

1.5 Literature Review
‘A Wi-Fi Enabled Robot’ by Mohammed Hisham, Sudhir V Prabhu, Ashwin Kumar [1]
Wi-Fi enabled Robot is a system which controls the turtle robot using a web page. To make this feasible we make use of the HNZG1 board with a built-in microcontroller and a ZeroG-2100 Wi-Fi module. The HNZG1 board, developed exclusively by MANIPAL DOT NET PVT. LTD., comprises the ZeroG-2100 module for wireless connectivity and an inbuilt PIC24F-series microcontroller from Microchip Technology Inc. In switch control mode the robot is controlled using a joystick and can be guided back, forth, left, and right; so in this project we extend an interface to the switch control mode to suit our requirements. Finally, we have been able to bridge together the HNZG1 board and the turtle robot, and eventually control the turtle robot in a wireless environment through the control buttons embedded on the custom web page designed by us.

‘Towards Smarter Robots With Smartphones’ by Rafael V. Aroca, Antônio Péricles B. S. de Oliveira, Luiz Marcos G. Gonçalves [2]
Mobile phones are among the top-selling mobile devices in the world. Due to their large production volumes, their prices have a high cost/benefit ratio. Current smartphones have a variety of built-in sensors that can be exploited to build robots. Using a smartphone as the "brain" of a robot is already an active research field with several open opportunities and promising possibilities. In this paper we present a review of current robots controlled by mobile phones and discuss a closed-loop control mechanism that we have developed to control mechatronic systems using the audio channels of mobile devices, such as phones and tablet computers. In our work, actuator commands are sent via audio and sensor readings are received by the phone, also via audio, using only analog electronics and no intermediate processing units.

‘Using the Android Platform to Control Robots’ by Stephan Göbel, Ruben Jubeh, Simon-Lennert Raesch and Albert Zündorf [3]
Android devices are powerful mobile computers with permanent internet connectivity and a rich variety of built-in sensors. More properties make the Android system very applicable for university use: Android uses the Java programming language, which our students are familiar with. Getting started with the Android API is easy; the API is open, i.e. developers can access almost every low-level function and are not sandboxed. In addition, the Android API allows easy access to the hardware components. Interesting for robotics use are the numerous communication interfaces such as Wi-Fi, Bluetooth, GSM/UMTS, and USB. Arduino boards provide open-source software solutions to control LEGO sensors and motors. In addition, Arduino boards provide connectivity for many other cheap sensors and actuators. This lowers the cost of a robot even more.



1.6 History
The First Remote Control Vehicle/Precision Guided Weapon:
This propeller-driven, radio-controlled boat, built by Nikola Tesla in 1898, is the original prototype of all modern-day uninhabited aerial vehicles and precision guided weapons; in fact, of all remotely operated vehicles in air, on land, or at sea. Powered by lead-acid batteries and electric drive motors, the vessel was designed to be manoeuvred alongside a target using instructions received from a wireless remote-control transmitter. Once in position, a command would be sent to detonate an explosive charge contained within the boat’s forward compartment. The weapon’s guidance system incorporated a secure communication link between the pilot’s controller and the surface-running torpedo, in an effort to ensure that control could be maintained even in the presence of electronic countermeasures.

Use of Remote Controlled Vehicles during World War 2:

During World War 2, in the European theatre, the U.S. Air Force experimented with three basic forms of radio-controlled guided weapons. In each case, the weapon would be directed to its target by a crew member on a control plane. The first weapon was essentially a standard bomb fitted with steering controls. The next evolution involved fitting a bomb to a glider airframe, one version, the GB-4, having a TV camera to assist the controller with targeting. The third class of guided weapon was the remote-controlled B-17.

RESULTS AND DISCUSSION
5.1 Experimental Results
5.2 Discussion
·      The initial idea which led us to the current project was that we wanted to help the military sector of our country in any way possible. While browsing YouTube we came across a video of robot control in which a robot is controlled using an Android phone via Bluetooth. This gave us the idea for the project "Android controlled robot through Wi-Fi". We also planned to stream the live video signal using a webcam.
·      Our next step was to find out the method by which controlling the robot and streaming the video could actually be realized. First we selected a microcontroller for the project. Then we came to know the advantages of the BeagleBone over the microcontroller, and decided to go with the BeagleBone even though it was costlier, simply because it is new and we could try something better.
·      This then led us to consider which processor would be best suited.
·      We decided upon the BeagleBone because of its stability and performance.
·      Then we collected materials and designed the schematic in Altium software. We then designed our own PCB from the schematic, with a trace width of 0.01 mil.
·      We then had a problem connecting the DC motor to the DC motor driver. At first we used an LM298 motor driver circuit to control the 5 V DC motor. During hardware testing we faced problems due to the motor driver, so we planned to replace it with a two-way relay circuit and then tested the sequence of coil activation.
·      As for the software, we initially programmed the movement of the DC motor for different conditions on the BeagleBone without the Eclipse platform.
·      There was a problem in the directional rotation of the motor, so we switched the connections to the motor from the relay circuit.
·      For the rest of the project we installed Eclipse on the Ubuntu operating system.
·      We designed a user-interface webpage for motor control and video streaming.
·      Burning the code into the BeagleBone and installing MJPG-streamer was the toughest phase of the project.
·      Then, using the Wi-Fi router, we accessed the BeagleBone.
·      The designated webpage was accessed and the control panel appeared.
·      We found a video streaming problem over Wi-Fi and discovered that the mistake had been made while entering the port address of the streamer. This problem was easily solved.
·      We tried the video streaming on an Android phone and found that Android phones with more than 512 MB of RAM are capable of streaming.
·      The video streaming was also tried using the VLC media player on a computer.
·      We faced the problem of limited Wi-Fi range, due to which the robot has a limited working range; there is no solution for this problem unless 4G technology is implemented.
·      Apart from the Wi-Fi range problem, we successfully tested our project, leading to a successful implementation of the idea.

CHAPTER 6
CONCLUSION AND FUTURE ENHANCEMENT
6.3 Conclusion
The modules for controlling the robot were successfully tested and demonstrated. A smart and easy means of guiding a robot is achieved using Wi-Fi. Controlling the motion of the robot via Wi-Fi is one of the easiest means, as it only requires the user to access the designated webpage on an Android phone. The live video streaming over the Wi-Fi network allows the user to remotely monitor the surrounding environment. We find that this demonstration using the BeagleBone, a Wi-Fi module, and an Android smartphone provides a good approach to controlling robots and streaming video to the desired display module.

6.4 Future Improvement
Password Protection
The project can be modified to password-protect the robot, so that it can be operated only if the correct password is entered.
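A minimal sketch of the idea, assuming the check happens before the control page is served; the use of a SHA-256 hash (rather than storing the password in plain text) is my addition, not part of the project.

```python
import hashlib

# "1234" is only a placeholder default password for illustration.
STORED_HASH = hashlib.sha256(b"1234").hexdigest()

def check_password(entered, stored_hash=STORED_HASH):
    """Hash the entered password and compare with the stored hash, so the
    plain-text password never needs to be kept on the robot."""
    return hashlib.sha256(entered.encode()).hexdigest() == stored_hash
```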

Saturday, 10 May 2014

GPRS PIC code for sending latitude and longitude to a specified URL

unsigned char SAT_DATA[57],i;
unsigned char error, byte_read,FLAG=2,FLAG1=0,FLAG2=2;
sbit fin1 at RB0_bit;

sbit shock at RB6_bit;
sbit shock1 at RB7_bit;

unsigned short count = 0;
unsigned char RA = 0, BYTE = 0, BYTE1;

unsigned char SITE[] = "gprsmanager.orgfree.com/animal/gprsvalue.php?";
//gprsmanager.orgfree.com/animal/gprsvalue.php

// Block until the NMEA "$GPRMC" header is seen on the software UART
void RX_GPRMS()
{
    const char header[] = "$GPRMC";
    unsigned char RX, j;
    for(j = 0; j < 6; j++)
    {
        RX = 0;
        while(RX != header[j])
            RX = Soft_UART_Read(&error);   // Read byte, then test error flag
    }
}
void RX_DATA()
{
    for(i = 0; i < 57; i++)
        SAT_DATA[i] = Soft_UART_Read(&error);   // Read byte, then test error flag
}

void TRANSMIT(unsigned char *string)
{
    while(*string)
        Soft_UART_Write(*string++);
}

void SEND_CMD(unsigned char *BASE_ADD, unsigned char COUNT)
{
    unsigned char i;
    for(i = 0; i < COUNT; i++)
    {
        UART1_Write(*BASE_ADD);
        BASE_ADD++;
    }
}

void ENTER(void)
{
    UART1_Write(13);                  // carriage return
    UART1_Write(10);                  // line feed
}

void RX_REPLY()
{
    RA = 0;
    while(RA != 'K')                  // wait for the 'K' of the "OK" reply
    {
        while(!UART1_Data_Ready());   // block until a byte is received,
        RA = UART1_Read();            // then read it
    }
}

void val()
{
    UART1_Write('"');                 // send a double-quote character
}
void fin1_angle_0()                   // servo at angle 0 degrees
{
    for(count = 0; count < 30; count++)
    {
        fin1 = 1;
        delay_us(582);                // high time sets the angle
        fin1 = 0;
        delay_us(19418);              // pad to 20000 us: the 20 ms servo frame
    }
}

void fin1_angle_90()                  // servo at angle 90 degrees
{
    for(count = 0; count < 30; count++)
    {
        fin1 = 1;
        delay_us(1400);
        fin1 = 0;
        delay_us(18600);
    }
}

void main() {
TRISB0_bit=0;
TRISB6_bit=0;
TRISB7_bit=0;
PORTB=0X00;
    UART1_Init(9600);               // Initialize UART module at 9600 bps
  Delay_ms(100);
     //UART1_Write_Text("Start");
  //UART1_Write(10);
 // UART1_Write(13);
 error=Soft_UART_Init(&PORTC, 4, 3, 9600, 0);
 Delay_ms(100);
 Soft_UART_Write('r');
  Delay_ms(100);
  fin1_angle_90();
  fin1_angle_0();
  fin1_angle_90();
    fin1_angle_0();
 
    strt:
                Delay_ms(100);
           SEND_CMD("AT",2);
         ENTER();
        Delay_ms(1000);
        SEND_CMD("AT",2);
         ENTER();

        Delay_ms(500);
        SEND_CMD("AT+CMGF=1",9);
         ENTER();

        Delay_ms(1000);
        SEND_CMD("AT+SAPBR=3,1,\"CONTYPE\",\"GPRS\"",29);
        ENTER();
         RX_REPLY();

        Delay_ms(500);
        SEND_CMD("AT+SAPBR=3,1,\"APN\",\"WWW\"",25);
        ENTER();
        RX_REPLY();
        Delay_ms(1000);

        SEND_CMD("AT+SAPBR=1,1",12);
        ENTER();
        BYTE=0;
        while(BYTE!='K')
        {
          while(!UART1_Data_Ready());     // If data is received,
      BYTE = UART1_Read();
      if(BYTE=='E')
      goto strt;
        }
        Delay_ms(500);

        SEND_CMD("AT+HTTPINIT",11);
        ENTER();
        RX_REPLY();
        Delay_ms(500);
        SEND_CMD("AT+HTTPPARA=\"CID\",1",19);
        ENTER();
        RX_REPLY();
          Delay_ms(1000);
          //  SEND_CMD("AT+HTTPPARA=\"URL\",",18);
          //val();
        //SEND_CMD(SITE,41);
 while(1)
 {
 Delay_ms(5000);
   RX_GPRMS();
   RX_DATA();
 /*  TRANSMIT("LAT") ;

  for(i=14;i<26;i++)
          {
         Soft_UART_Write(SAT_DATA[i]);
         }
         TRANSMIT("LOG");
          for(i=27;i<39;i++)
         {
         Soft_UART_Write(SAT_DATA[i]);
         }
          Delay_ms(1000);
          Soft_UART_Write(10);
          Soft_UART_Write(13);
           Delay_ms(1000);  */
              SEND_CMD("AT+HTTPPARA=\"URL\",",18);
          val();
          //val();
        SEND_CMD(SITE,45);
        // TRANSMIT("LAT") ;
        UART1_Write_Text("v1=");
  for(i=14;i<23;i++)
          {
         Soft_UART_Write(SAT_DATA[i]);
           UART1_Write(SAT_DATA[i]);
         }
        // TRANSMIT("LOG");
         UART1_Write('&');
          UART1_Write_Text("v2=");
          for(i=27;i<36;i++)
         {
         Soft_UART_Write(SAT_DATA[i]);
           UART1_Write(SAT_DATA[i]);
         }
        /*  for(i=0;i<57;i++)
        {
        SAT_DATA[i]=Soft_UART_Read(&error);   // Read byte, then test error flag
        UART1_Write(SAT_DATA[i]);
        } */

        val();
          ENTER();
      //    RX_REPLY();


       Delay_ms(1000);
         SEND_CMD("AT+HTTPACTION=0",15);
         ENTER();

       while(BYTE1!='2')
        {
         while(!UART1_Data_Ready());     // If data is received,
      BYTE1 = UART1_Read();

        }
           Delay_ms(2000);
          //  red=0;
        SEND_CMD("AT+HTTPREAD=0,200",17);
         ENTER();

         while(BYTE1!='@')
        {

       while(!UART1_Data_Ready());     // If data is received,
     BYTE1 = UART1_Read();
        if(BYTE1=='K')
        goto END;
        //  UART1_Write(RX);
        }
         while(!UART1_Data_Ready());     // If data is received,
      BYTE1 = UART1_Read();
             if(BYTE1=='S')
       {
   //    RELAY=1;
   if(FLAG2==0)
   {
   FLAG2=1;
shock=1;
  Delay_ms(10);
  shock1=1;
    Delay_ms(10);
     }
     }
        FLAG2=0;
       if(BYTE1=='N')
       {
   //    RELAY=1;
   if(FLAG==0)
   {
   FLAG=1;
   fin1_angle_90();
  // Delay_ms(1000);
     fin1_angle_0();
     }
     }
       if(BYTE1=='F')
       FLAG=0;
     //  RELAY=0;
             Delay_ms(500);


        END:

         Delay_ms(1000);
         }
 


}
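The fin1_angle_0 / fin1_angle_90 routines above drive the servo with hand-tuned pulse widths inside a 20 ms frame. As a general sketch, the pulse width for an arbitrary angle can be computed as below; the 500-2500 µs endpoints are typical hobby-servo values, not measured for this particular servo.

```python
def servo_pulse_us(angle, min_us=500, max_us=2500):
    """Pulse width in microseconds for a servo angle in [0, 180] degrees."""
    angle = max(0, min(180, angle))
    return int(min_us + (max_us - min_us) * angle / 180)

def servo_gap_us(angle):
    """Remaining low time so each high+low frame totals the 20 ms period."""
    return 20000 - servo_pulse_us(angle)
```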

GSM-GPS code for an emergency service project


int i = 0;
int received = 0;
char DataType[] = "GPXXX";
char NMEA[] = "$xxxxx,xxxxxxxxx,xxxx.xxx,x,xxxxx.xxx,x,x,xx,x.x,xxx.x,x,xx.x,x,,*xx";
char receive;
char  error;

  sbit LED at RA0_bit;
           //sbit LED1 at RC2_bit;
          // sbit LED2 at RC3_bit;
char uart_rd;

unsigned char t,add;
unsigned char temp,temp1,temp2;              //


sbit  RELAY at RC0_bit;

//sbit USEN at RB7_bit;
//sbit VIB at RB1_bit;
sbit check2 at RD7_bit;
sbit check1 at RD6_bit;




    unsigned char SAT_DATA[57];

unsigned char pass[5]="1234", pass1[5],compare,fg=0;

 char txt[6];
unsigned int tmp1,tmp2,tmp3,count,count1=0;

unsigned char SCI_ReceiveByte(void)   // receive one byte over the UART
{
    unsigned char byte;
    while(!UART1_Data_Ready());       // wait until data is received,
    byte = UART1_Read();              // then read it
    return byte;                      // and return the data
}



// Block until the NMEA "$GPRMC" header is seen on the UART
void RX_GPRMS()
{
    const char header[] = "$GPRMC";
    unsigned char RX, j;
    for(j = 0; j < 6; j++)
    {
        RX = 0;
        while(RX != header[j])
            RX = SCI_ReceiveByte();
    }
}

void RX_DATA()
{
    for(i = 0; i < 57; i++)
        SAT_DATA[i] = SCI_ReceiveByte();
}

void send_sms1()
{
    LED = 0;
    RELAY = 0;
    Delay_ms(500);

    RX_GPRMS();
    RX_DATA();

    LED = 1;
    RELAY = 1;
    Delay_ms(500);

    UART1_Write_Text("AT+CMGS=\"8123902188\"");   // start an SMS to this number
    Delay_ms(50);
    UART1_Write(13);
    UART1_Write(10);
    Delay_ms(500);
    Delay_ms(100);

    UART1_Write_Text("LAT");
    for(i = 14; i < 26; i++)
        UART1_Write(SAT_DATA[i]);                 // latitude bytes of $GPRMC

    UART1_Write_Text("LOG");
    for(i = 27; i < 39; i++)
        UART1_Write(SAT_DATA[i]);                 // longitude bytes

    UART1_Write(26);                              // Ctrl-Z terminates the SMS
}




void send_sms2()
{
    LED = 0;
    RELAY = 0;
    Delay_ms(500);

    RX_GPRMS();
    RX_DATA();

    LED = 1;
    RELAY = 1;
    Delay_ms(500);

    UART1_Write_Text("AT+CMGS=\"9739100797\"");   // start an SMS to this number
    Delay_ms(50);
    UART1_Write(13);
    UART1_Write(10);
    Delay_ms(500);
    Delay_ms(100);

    UART1_Write_Text("LAT");
    for(i = 14; i < 26; i++)
        UART1_Write(SAT_DATA[i]);                 // latitude bytes of $GPRMC

    UART1_Write_Text("LOG");
    for(i = 27; i < 39; i++)
        UART1_Write(SAT_DATA[i]);                 // longitude bytes

    UART1_Write(26);                              // Ctrl-Z terminates the SMS
}





void main() {

 UART1_Init(9600);
 Delay_ms(100);
//TRISD=0X00;
//TRISB=0X00;

TRISA0_bit=0;

TRISC0_bit=0;

TRISD7_bit=1;
TRISD6_bit=1;
//TRISA0_bit=0;

//TRISD7_bit=1;
//check1=1;
LED=1;

     Delay_ms(100);
         RELAY=1;
        Delay_ms(2000);
         UART1_write_Text ("AT");
         Delay_ms(200);
         UART1_Write(13);
          UART1_Write(10);
          Delay_ms(500);

          UART1_write_Text ("AT+CMGF=1");
         Delay_ms(200);
         UART1_Write(13);
          UART1_Write(10);

          UART1_write_Text ("AT+CMGD=1");
         Delay_ms(200);
         UART1_Write(13);
          UART1_Write(10);
          Delay_ms(500);


        RELAY=0;
        check1=0;
        check2=0;
        Delay_ms(1000);
        LED=0;

   while(1) {


if(check1==1)
{
LED=1;
  RELAY=0;
  delay_ms(100);
  send_sms1();
  delay_ms(100);
}
else if(check2==1)
{
LED=1;
RELAY=0;
delay_ms(100);
send_sms2();
delay_ms(100);
}
}
}
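Both programs above pull latitude and longitude out of SAT_DATA using fixed byte offsets (14-25 and 27-38), which assumes the GPS module always emits fixed-width $GPRMC fields. A more robust approach is to split the sentence on commas; a sketch in Python, using the standard NMEA 0183 GPRMC field order:

```python
def parse_gprmc(sentence):
    """Extract (latitude, longitude) strings from a $GPRMC sentence.
    Returns None when the fix is not valid (status field is not 'A')."""
    fields = sentence.split(",")
    if not fields[0].endswith("GPRMC") or fields[2] != "A":
        return None
    # fields 3/4: latitude + N/S hemisphere, fields 5/6: longitude + E/W
    return fields[3] + fields[4], fields[5] + fields[6]
```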

Friday, 9 May 2014

INSTALLING MJPG STREAMER ON BEAGLEBONE BLACK:




Instructions for getting mjpg-streamer installed on a BeagleBone device follow, with simple commands; alternatively you can directly download the mjpg-streamer folder and untar it. With some trial and error I have put together the following steps, which successfully install mjpg-streamer on a BeagleBone Black running Ubuntu 13.04.

Now before we install any new software, it is a good idea to update opkg:

:   opkg update

Download mjpg streamer from:

:   wget https://github.com/shrkey/mjpg-streamer/raw/master/mjpg-streamer.tar.gz

Once we have it, we need to uncompress it into its own directory using the tar command.

     :  tar -xvf ./mjpg-streamer.tar.gz
     
     COMPILATION STEPS

Execute the following commands in the terminal after uncompressing:
      
 cd mjpg-streamer

 make

 sudo make install

TO RUN

With a webcam attached to your BeagleBone Black, and whilst still in the mjpg-streamer directory, run the command below.

Before booting your BeagleBone attach your webcam, else the Bone won't detect it.
To check whether it is attached, use the command

:  lsusb

This will list the connected USB devices.

    sudo ./mjpg_streamer -i "./input_uvc.so" -o "./output_http.so -w ./www"

You should be in the mjpg-streamer directory, so

:  cd mjpg-streamer

If it gives an error like "sudo is unknown", you can execute as the super user by using the command

:  sudo -s

Even if that fails, you can change the permissions of the directory with

: chmod 777 mjpg-streamer

Now you are ready to run it with

:  ./mjpg_streamer -i "./input_uvc.so" -o "./output_http.so -w ./www"

So your webcam is enabled; to check the video you can view it in VLC and even at a URL.
For the webpage:

     : http://192.168.7.2:8080

Use your BeagleBone's IP address; in my case it is 192.168.7.2.
To check the IP address you can use the command

:  ifconfig

For VLC:

Copy the stream address from the URL above, paste it into VLC's network stream, then hit play. You are ready to see the video stream in VLC.
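Under the hood, the MJPG stream served on port 8080 is a multipart HTTP response in which every part is an ordinary JPEG image. As a sketch of what a client such as VLC does with the received bytes, the function below pulls the first complete JPEG frame out of a buffer; it relies only on the standard JPEG start (FF D8) and end (FF D9) markers, not on mjpg-streamer's exact boundary string.

```python
def extract_jpeg(buf):
    """Return the first complete JPEG frame in an MJPEG byte buffer, or None.
    JPEG frames begin with the bytes FF D8 and end with FF D9."""
    start = buf.find(b"\xff\xd8")
    if start == -1:
        return None
    end = buf.find(b"\xff\xd9", start + 2)
    if end == -1:
        return None
    return buf[start:end + 2]
```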

Wednesday, 19 March 2014

OpenCV 2.4.7 installation on Ubuntu.

 http://opencv.org/downloads.html (you can download the newest version of OpenCV here)

To install OpenCV using the terminal on Ubuntu, you will need some dependency packages first:

$ su -

If your password is not validated, then use the command

$ sudo -s

Before installing any software it is preferred to update using apt-get:


# apt-get update

The dependency packages for OpenCV are:


# apt-get install build-essential libavformat-dev x264 v4l-utils ffmpeg libcv2.3 libcvaux2.3 libhighgui2.3 python-opencv opencv-doc libcv-dev libcvaux-dev libhighgui-dev libgtk2.0-dev libjpeg-dev libtiff4-dev libjasper-dev libopenexr-dev cmake python-dev python-numpy python-tk libtbb-dev libeigen2-dev yasm libfaac-dev libopencore-amrnb-dev libopencore-amrwb-dev libtheora-dev libvorbis-dev libxvidcore-dev libx264-dev libqt4-dev libqt4-opengl-dev sphinx-common texlive-latex-extra libv4l-dev libdc1394-22-dev libavcodec-dev libavformat-dev libswscale-dev

OpenCV should be installed in the directory path of your choice.


Download OpenCV and untar the package we downloaded:


$ wget http://sourceforge.net/projects/opencvlibrary/files/opencv-unix/2.4.7/opencv-2.4.7.tar.gz/download
$ tar -xzvf opencv-2.4.7.tar.gz
$ cd opencv-2.4.7

Now you will need root privileges to compile and install OpenCV:

# mkdir build
# cd build
# cmake -D WITH_TBB=ON -D BUILD_NEW_PYTHON_SUPPORT=ON -D WITH_V4L=ON -D INSTALL_C_EXAMPLES=ON -D INSTALL_PYTHON_EXAMPLES=ON -D BUILD_EXAMPLES=ON -D WITH_QT=ON -D WITH_OPENGL=ON ..
# make
# make install

You may also want to compile and view the examples; to see them and to practice you can refer to the path
 /usr/share/doc/opencv-doc/examples .

So now you can work with OpenCV ;)

Sound Source Localization engineering project

Abstract
The purpose of this project will be to create an auditory system analogous to a pair of
human ears.
In animal auditory systems, the method of determining the location of a sound source
is the Interaural Time Difference (ITD) cue in the auditory cortex of the brain. The ITD
is the time difference between the arrival of the signal at the first ear, and the arrival of
the signal at the second ear. Consequently, this results in an Interaural Phase Difference
(IPD) between the signals at each ear. Assuming the origin of the sound is a point source,
the intensity at some distance R from the source can be determined as
I = P / (4πR²)

where, clearly, we can see that

I ∝ 1/R²
Since the microphones will be at different radial distances from the sound source, there
should then be distinct level differences between both signals. For this particular project,
the operation of the device can be described as sound source localization. This shall be
accomplished by an arrangement of two microphones placed equal distances from a pivot
point rotated by a servo motor, which we can refer to as the origin. A sound source, placed
some distance in front of the device, will emit a continuous audio signal that can be picked
up by the microphones. When the audio signal reaches a predetermined trigger voltage
(between 2.5 and 5 V) at either microphone, data accumulation begins for both microphone
channels simultaneously. To ensure a precise output we require the time between the
collection of each data point to be sufficiently small (∼10 µs). This allows for more
reliable waveform analysis, thus more accurately pinpointing the extrema of each input
signal. If we consider the microphone array as the x-axis, then the phase shift determined
from the waveforms corresponds to the angular rotation from the positive y-axis, also known
as the azimuth. We then rotate the microphone array through this azimuth; thus orienting
the device to the sound source. Problems we encountered included low- and high-frequency
input-signal noise; this was dealt with by restricting the range of frequencies available
for analysis, achieved by implementing an appropriate bandpass filter. Writing an
appropriate algorithm for analyzing the input data also proved troublesome. The actual
device as proposed was not realistic, and in its place a simpler high-frequency tracking
device was created, making small adjustments based on a comparison of two averaged data
samples.



flowchart


hardware



software