Author Topic: Biomimetic "Fly Eye" sensors  (Read 3815 times)


Offline gavin.philipsTopic starter

  • Beginner
  • *
  • Posts: 3
  • Helpful? 0
Biomimetic "Fly Eye" sensors
« on: May 01, 2008, 05:10:22 PM »
Hello all,

I just recently discovered your website, and have found it to be extremely helpful!  I am currently finishing my master's thesis in electrical engineering at the University of Wyoming, specializing in assistive technology (specifically smart wheelchair systems), so the great information on wide varieties of sensors and control algorithms is just marvelous.

Anyway, I was reading the section on machine vision, and noticed brief mentions of biomimetic design and compound eyes in nature.  I thought I might let you know about some interesting work going on here at UW in that area.

My advisors, and most of the other graduate students in my research group, have been working for a long time to develop a sensor based on the compound eye of the common house fly.  They are currently testing hardware prototypes and have been seeing amazing results.  The sensors exhibit hyperacuity (due to the overlapping Gaussian responses of adjacent photoreceptors), so they can detect motion finer than the usual pixel boundaries of CCD arrays.  The sensors have proven able to detect a black wire moving against a black background in a dark room.

These sensors seem perfect for applications like finding tripwires, tracking hanging power lines for UAVs, and anything that involves tracking movement and finding shapes/lines/landmarks.

The home page for our research group (the WISPR lab) can be found at http://wwweng.uwyo.edu/electrical/research/wispr/.  If you navigate to the research or publication pages, you can find more information on the fly eye work, some really interesting swarm robot research involving beautifully simple physicomimetics, and several other research topics.

I don't mean to come across as an advertisement or anything like that; I just thought you might like to hear about some interesting work, and I know how these sorts of creative people take ideas and run with them.

Thanks for your time!

Gavin

Offline gavin.philipsTopic starter

  • Beginner
  • *
  • Posts: 3
  • Helpful? 0
Re: Biomimetic "Fly Eye" sensors
« Reply #1 on: May 01, 2008, 05:13:58 PM »
P.S.  There will be no way to miss all of the references to this site in my thesis.  =P

Offline hgordon

  • Expert Roboticist
  • Supreme Robot
  • *****
  • Posts: 373
  • Helpful? 7
    • Surveyor Robotics Journal
Re: Biomimetic "Fly Eye" sensors
« Reply #2 on: May 02, 2008, 03:16:06 PM »
The fly eye looks cool.  I didn't find any technical specs in a quick look at the site; could you provide a brief description?  Is this composed of multiple lenses focusing light on a single array, or are there multiple arrays?

Also, what's the sensor configuration on the Maxelbots ?
Surveyor Corporation
  www.surveyor.com

Offline Admin

  • Administrator
  • Supreme Robot
  • *****
  • Posts: 11,703
  • Helpful? 173
    • Society of Robots
Re: Biomimetic "Fly Eye" sensors
« Reply #3 on: May 03, 2008, 12:54:46 PM »
Interesting!

How big is it? How much processing power does it require? I/O requirements? Got pics?

I'd be willing to work with you to get it tested on a real robot - if you can get it the same size as a normal camera, there would be huge apps for this as you said.

ps - thanks :)

Offline gavin.philipsTopic starter

  • Beginner
  • *
  • Posts: 3
  • Helpful? 0
Re: Biomimetic "Fly Eye" sensors
« Reply #4 on: May 05, 2008, 03:16:54 PM »
I worked mostly on software when I was on the project, and now I am working specifically on application to smart wheelchairs, so I'm not the best person to answer your questions.  I'll try though.

A paper was just published on the topic, and is available free of cost for 30 days after publication at http://www.iop.org/EJ/abstract/1748-3190/3/2/026003/.

The fly eye actually uses ball lenses connected through fibers to individual photoreceptors.  Then, each cartridge of seven photoreceptors shares a common lens.  Multiple cartridges can be combined to form the 49-photoreceptor sensor here:

[images: ball-lens cartridges and the 49-photoreceptor sensor prototype]

John Benson was nice enough to share these images with me, as they are from his dissertation work.


As far as interfacing, each photoreceptor can produce a single voltage.  We can subtract the outputs from two photoreceptors in a line and find an almost linear voltage that corresponds to the position of a line in its field of view.  I don't know if I explained that very well...  I guess to put it simply, one 8-channel A-to-D converter should be sufficient for one seven-photoreceptor cartridge.  I believe John may be working on improvements to the interface as well.
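To make the subtraction idea concrete, here is a toy sketch of two overlapping Gaussian responses (my own illustrative numbers, not the lab's actual sensor model):

```python
import numpy as np

# Illustrative sketch of the hyperacuity readout: two adjacent
# photoreceptors with overlapping Gaussian angular responses,
# centered at +/- 0.5 degrees.  Sigma and the centers are assumed
# values, not measured from the real sensor.
sigma = 0.7                  # assumed response width (degrees)
centers = (-0.5, 0.5)

def receptor_output(target_angle, center):
    """Photoreceptor voltage for a thin line at target_angle."""
    return np.exp(-((target_angle - center) ** 2) / (2 * sigma ** 2))

angles = np.linspace(-0.4, 0.4, 9)
diff = [receptor_output(a, centers[1]) - receptor_output(a, centers[0])
        for a in angles]

# Near the midpoint the difference signal is monotonic and nearly
# linear in the line's angular position.
for a, d in zip(angles, diff):
    print(f"{a:+.2f} deg -> {d:+.4f}")
```

Near the midpoint the difference is close to linear, which is why one differential pair can localize a line to well under a "pixel."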


As for the Maxelbots, they are part of the work being done by Drs. Bill and Diana Spears, of our COSC department.  If I remember correctly, they use sonar (Devantech SRF04s and something else) and radio to communicate with each other, and a simple physicomimetics routine to keep in formation.  As you change the coefficients of the physics equations, they fall into different lattice formations, which is really interesting to watch.  In order to produce the ultrasonic signals in a plane, I *think* they had custom reflective cones made.  They aim the sonar transceivers straight down onto the cones, which then disperse and collect the signals in a plane at a certain level.  I believe they use 3 of these cones, one radio transceiver, and a trilateration scheme in order to track the position and orientation of each robot with respect to the others.

There are a lot of papers and video available on their swarm research, Maxelbots, etc. at http://www.cs.uwyo.edu/~wspears/maxelbot/.  If you look closely in the videos, you can see the three cones and sonar sensors on each maxelbot.  I wish I could remember what they used for obstacle avoidance, but the best I can tell you offhand is that they have been using some very common sonar and infrared sensors in most of their work.

Offline hgordon

  • Expert Roboticist
  • Supreme Robot
  • *****
  • Posts: 373
  • Helpful? 7
    • Surveyor Robotics Journal
Re: Biomimetic "Fly Eye" sensors
« Reply #5 on: May 05, 2008, 07:05:26 PM »
Thanks for the info.  That's nice work!
Surveyor Corporation
  www.surveyor.com

Offline williammspears

  • Beginner
  • *
  • Posts: 4
  • Helpful? 0
    • Swarmotics, LLC
Re: Biomimetic "Fly Eye" sensors
« Reply #6 on: December 10, 2008, 12:30:24 AM »
Quote from: gavin.philips
There are a lot of papers and video available on their swarm research, Maxelbots, etc. at http://www.cs.uwyo.edu/~wspears/maxelbot/.  If you look closely in the videos, you can see the three cones and sonar sensors on each maxelbot.  I wish I could remember what they used for obstacle avoidance, but the best I can tell you offhand is that they have been using some very common sonar and infrared sensors in most of their work.

      We have quite a few videos at YouTube also, explaining how we do trilateration. We have an I2C
      architecture, so it isn't hard to plug in more sensors. Gavin is correct. We use Sharp GP2D12 sensors
      for detecting objects. They have a very narrow beam, and work via triangulation. One nice aspect
      is that they detect non-reflective objects very well, regardless of their angle with respect to the sensor. Shiny objects
      are a problem, however. We have built an "Obstacle Detection Module" that can handle as many as
      eight Sharp sensors, but in practice we use three to five.
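For anyone wiring one up: the GP2D12's analog output is nonlinear in distance, and a common way to linearize it is a power-law fit. The constants below are illustrative guesses, not factory calibration; calibrate against known distances before trusting it.

```python
# Hedged sketch of GP2D12 linearization.  The sensor's output voltage
# falls off roughly as an inverse power of distance; the fit constants
# here (27.86, -1.15) are illustrative, not official Sharp values.
def gp2d12_distance_cm(voltage):
    """Approximate target distance (cm) from sensor output voltage."""
    if voltage <= 0.4:               # below ~0.4 V the sensor is out of range
        return None
    return 27.86 * voltage ** -1.15  # assumed empirical power-law fit

for v in (2.5, 1.0, 0.5):
    print(f"{v:.1f} V -> {gp2d12_distance_cm(v):.1f} cm")
```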

      I have a note that the acoustic transducers operate in the 40 kHz range and are called "400ST/R160".
      See: http://www.prowave.com.tw/english/products/ut/open-type/400s160.htm

Quote from: gavin.philips
As for the Maxelbots, they are part of the work being done by Drs. Bill and Diana Spears, of our COSC department.  If I remember correctly, they use sonar (Devantech SRF04s and something else) and radio to communicate with each other, and a simple physicomimetics routine to keep in formation.  As you change the coefficients of the physics equations, they fall into different lattice formations, which is really interesting to watch.  In order to produce the ultrasonic signals in a plane, I *think* they had custom reflective cones made.  They aim the sonar transceivers straight down onto the cones, which then disperse and collect the signals in a plane at a certain level.  I believe they use 3 of these cones, one radio transceiver, and a trilateration scheme in order to track the position and orientation of each robot with respect to the others.

      Gavin is again correct. Our XSRF (Extended Sensor Range Finder) boards are improved versions of the
      Devantechs (with more op amps and bandpass filters etc, and other mods). The cones are parabolic in
      shape, and the transducer is at the focal point. Hence the acoustic energy is spread into the plane. When
      acoustic energy is received, the cones act as focus mechanisms, making it easier to hear the energy.

      One Abacom AM transceiver is used, with 3 acoustic transducers/XSRF boards/cones. When one robot
      provides an RF and acoustic "ping", the other robots can figure out where the pinging robot is, in their
      own local coordinate systems (using trilateration, as stated).
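The trilateration step itself is just geometry. Here is a minimal sketch (the beacon layout and ranges are made up for illustration, not the Maxelbot's actual transducer spacing):

```python
import numpy as np

# Each robot hears the ping at three receivers with known local
# positions and measures three ranges.  Subtracting pairs of the
# squared-range equations (x-xi)^2 + (y-yi)^2 = ri^2 cancels the
# quadratic terms, leaving a linear system for the source (x, y).
def trilaterate(beacons, ranges):
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)

# Example: three receivers ~10 cm apart, pinging robot at (50, 30).
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([50.0, 30.0])
ranges = [np.hypot(*(true_pos - np.array(b))) for b in beacons]
print(trilaterate(beacons, ranges))   # recovers (50, 30)
```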

      So, that is the "enabling hardware technology" that we use. The "enabling software technology" is
      "physicomimetics", which inherently has excellent scalability, robustness, resistance to noise, etc. It
      is the reason that our transitions from simulations to actual robots have not been difficult.
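As a toy illustration of the physicomimetics force law (my constants are illustrative, not the actual Maxelbot parameters), two agents under a split attract/repel virtual "gravity" settle at the desired spacing:

```python
import numpy as np

# Minimal 1-D sketch of the physicomimetics ("artificial physics")
# idea: each pair of robots feels a virtual force of magnitude G / r^p
# that repels below the desired separation R and attracts above it, so
# a swarm settles into a regular lattice.  All constants are assumed.
G, p, R = 100.0, 2.0, 10.0    # assumed gain, exponent, target spacing
F_MAX, DT, STEPS = 5.0, 0.05, 400

def pair_force(r):
    """Signed virtual force: positive = attraction, negative = repulsion."""
    f = min(G / r ** p, F_MAX)       # Newtonian magnitude, clipped
    return f if r > R else -f        # attract when far, repel when close

x = np.array([0.0, 3.0])             # start closer together than R
v = np.zeros(2)
for _ in range(STEPS):
    f = pair_force(x[1] - x[0])
    v = 0.8 * (v + DT * np.array([f, -f]))   # friction damps oscillation
    x += DT * v

print(f"final separation: {x[1] - x[0]:.2f}")   # settles near R
```

Changing G, p, and R changes the lattice geometry, which matches the different formations Gavin described.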

      If there is further interest in this project, or if you have questions, please contact me at
      [email protected]

      Thanks! Bill

      PS. A disclosure note - I was on Gavin's M.S. committee and think he's a great guy, so I'm biased. ;-)
      He did some very good work, as you can tell.

 

