Author Topic: HELP NEEDED! microphone interfacing  (Read 22462 times)


Offline weeramanTopic starter

  • Beginner
  • *
  • Posts: 1
  • Helpful? 0
HELP NEEDED! microphone interfacing
« on: May 18, 2007, 01:10:27 AM »
Hi,
I'm an engineering undergraduate from Sri Lanka, currently involved in a robotics project.
The project is about making a caretaker robot that will find its master by sound source localization.

We intend to build an autonomous robot with a PC motherboard on it for processing.
For voice localization we need to connect four microphones to the motherboard.
For that we need advice or a way to do it.
If anybody is interested, please reply.

Rgds
Nilanka

Offline Somchaya

  • Robot Overlord
  • ****
  • Posts: 180
  • Helpful? 0
  • You know it's cute!
Re: HELP NEEDED! microphone interfacing
« Reply #1 on: May 18, 2007, 01:34:14 AM »
Hmmm, if you have very accurate timing, you can determine the time difference between hearing the sounds on each of the microphones, and from that you can triangulate the position of the source. That's kind of how our ears do it.

Alternatively, you can work on volume, and turn such that the left/right microphones receive similar volume. Something like that can probably be extended to 4 microphones.
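
For the timing approach, here's a minimal sketch of turning a measured arrival-time difference between two microphones into a bearing. It's Python, and the 0.30 m mic spacing is just an assumed figure; the far-field arcsin formula is the standard trick.
Code:
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 C
MIC_SPACING = 0.30       # assumed distance between the two mics, in m

def bearing_from_delay(delta_t):
    """Source angle off the mic pair's broadside axis, in radians.

    delta_t is the arrival-time difference in seconds, positive when
    the sound reaches the right mic first. Far-field approximation.
    """
    # Path-length difference, clamped so asin stays in range when
    # noise makes delta_t slightly too large.
    x = max(-1.0, min(1.0, delta_t * SPEED_OF_SOUND / MIC_SPACING))
    return math.asin(x)

# A 500 us lead on the right mic puts the source ~35 degrees right:
print(math.degrees(bearing_from_delay(500e-6)))
With four mics you'd get a bearing from each pair and combine them, which also resolves the front/back ambiguity a single pair has.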
Somchaya - Back after a year of misc things
http://whisker.scribblewiki.com

Offline dunk

  • Expert Roboticist
  • Supreme Robot
  • *****
  • Posts: 1,086
  • Helpful? 21
    • dunk's robot
Re: HELP NEEDED! microphone interfacing
« Reply #2 on: May 18, 2007, 03:41:04 AM »
For the hardware I'd recommend 4 USB sound cards.
(Far easier to use commodity hardware than to design your own.)
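
If you go that route, here's a minimal sketch of grabbing audio from several cards at once, assuming the third-party python-sounddevice (PortAudio) library; the device indices are made up, so run sd.query_devices() to find yours. One caveat: separate USB cards run on separate sample clocks, so the channels will drift slightly relative to each other.
Code:
import numpy as np
import sounddevice as sd

FS = 44100                 # sample rate, Hz
SECONDS = 1.0              # how long to record
DEVICES = [1, 2, 3, 4]     # hypothetical indices, one per USB card

buffers = [[] for _ in DEVICES]

def make_callback(buf):
    # Each stream appends its incoming blocks to its own list.
    def callback(indata, frames, time, status):
        buf.append(indata.copy())
    return callback

streams = [sd.InputStream(device=dev, channels=1, samplerate=FS,
                          callback=make_callback(buf))
           for dev, buf in zip(DEVICES, buffers)]

for s in streams:
    s.start()
sd.sleep(int(SECONDS * 1000))
for s in streams:
    s.stop()
    s.close()

# One mono numpy array per microphone.
channels = [np.concatenate(buf)[:, 0] for buf in buffers]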

dunk.

Offline Admin

  • Administrator
  • Supreme Robot
  • *****
  • Posts: 11,703
  • Helpful? 173
    • Society of Robots
Re: HELP NEEDED! microphone interfacing
« Reply #3 on: May 18, 2007, 07:35:42 AM »
These might help for ideas:
http://www.societyofrobots.com/robotforum/index.php?topic=369.msg2313#msg2313
http://www.societyofrobots.com/robotforum/index.php?topic=283.0

Quote
Hmmm, if you have very accurate timing, you can determine the time difference between hearing the sounds on each of the microphones, and from that you can triangulate the position of the source. That's kind of how our ears do it.
Actually our ears work on volume differences - our thick skulls block sound between both our ears to enhance this difference. :P
Some animals change their ear shape to locate sound, like dogs or bats.

The other issue with a clock is that it needs to be extremely high precision, especially at short distances. And super-fast clocks are wasteful energy-wise . . . the first step, if you plan to do the timing method, is to use the expected distance and the speed of sound to calculate how fast a clock you need for a decent amount of precision. Knowing your clock speed will help you choose a processor.
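
As a quick worked example of that first step (Python; the 1 cm resolution target is just an assumption):
Code:
SPEED_OF_SOUND = 343.0   # m/s in air
resolution_m = 0.01      # assumed target: resolve 1 cm of path difference

dt = resolution_m / SPEED_OF_SOUND   # smallest time difference to resolve
clock_hz = 1.0 / dt                  # minimum timer tick rate

print(f"resolve {dt * 1e6:.1f} us -> timer of at least {clock_hz / 1e3:.0f} kHz")
# resolve 29.2 us -> timer of at least 34 kHz
So even cm-level resolution only needs a timer in the tens of kHz, nothing exotic.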

Offline JonHylands

  • Expert Roboticist
  • Supreme Robot
  • *****
  • Posts: 562
  • Helpful? 3
  • Robot Builder/ Software Developer
    • Jon's Place
Re: HELP NEEDED! microphone interfacing
« Reply #4 on: May 18, 2007, 08:17:53 AM »
The clock doesn't have to be that high of a precision.

Speed of sound is what, around 1000 feet per second? That's 1 foot per millisecond, or 1 inch per 83 microseconds.

An AVR microcontroller running at, say, 20 MHz can easily do timing at the microsecond level, so you could quite readily measure down to, say, a 1/8" difference in arrival distance (around 10 microseconds, which is 200 instructions at that clock rate).

It's certainly not a trivial thing to do, although there was an article in the last year or two in one of the robot magazines about a project that did it with two microphones.
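
On the measuring side, if the channels are sampled synchronously (e.g. one multi-channel card), the usual way to get the time difference is a cross-correlation peak search. A minimal numpy sketch (a standard technique, not what the magazine article used):
Code:
import numpy as np

def delay_seconds(a, b, fs):
    """Seconds by which channel b lags channel a (positive: sound hit mic a first)."""
    corr = np.correlate(a, b, mode="full")
    lag = (len(b) - 1) - np.argmax(corr)   # peak position, in samples
    return lag / fs

# e.g. a 13-sample lag at 44.1 kHz is ~295 us, about 10 cm of extra path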

- Jon

Offline Admin

  • Administrator
  • Supreme Robot
  • *****
  • Posts: 11,703
  • Helpful? 173
    • Society of Robots
Re: HELP NEEDED! microphone interfacing
« Reply #5 on: May 18, 2007, 01:41:49 PM »
Just ran into this today without even looking for it:
http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html

It has circuits, diagrams, equations, pics, and even a parts list.

Offline Rolf

  • Jr. Member
  • **
  • Posts: 11
  • Helpful? 0
Re: HELP NEEDED! microphone interfacing
« Reply #6 on: May 18, 2007, 04:07:36 PM »
Quote
Actually our ears work on volume differences - our thick skulls block sound between both our ears to enhance this difference. :P
Some animals change their ear shape to locate sound, like dogs or bats.

I know it's a little off topic, but how do our ears know if a sound is coming from behind or in front of you?

Offline Admin

  • Administrator
  • Supreme Robot
  • *****
  • Posts: 11,703
  • Helpful? 173
    • Society of Robots
Re: HELP NEEDED! microphone interfacing
« Reply #7 on: May 18, 2007, 04:27:15 PM »
Quote
I know it's a little off topic, but how do our ears know if a sound is coming from behind or in front of you?
You know, I'm not really sure . . . I've been thinking about this a lot since last summer, too . . . if someone knows, please tell us!

For now, I'll offer my theories that potentially explain this . . .

So the issue is: how do we triangulate sound when we only have two ears?

The first is ear shape. Ear shape plays a large part in how we hear sounds, and many animals even have the ability to change ear shape to 'focus' on a particular sound. Ear shape is still very much an active area of research, so it's not fully understood . . . Think about it: sound entering our odd-shaped ears from different directions could 'sound' very different. Our brain could potentially infer where a sound comes from by how it sounds.

The next theory is head movement. Ever notice yourself moving your head to better identify a sound? Two ears, moved to another location, are equivalent to four ears. But of course this requires the sound to be constant while you move your head . . . Since your ears have a special shape, the sound could be 'expected' to change in a particular way when you move your head in a particular way . . . possible, at least . . .

And my last theory is that even though you have two ears, it's not just two points in space. Each ear has thousands of hair sensors distributed over an area - hence at least three points for triangulation.

But alas, these are unproven theories . . .

Offline dunk

  • Expert Roboticist
  • Supreme Robot
  • *****
  • Posts: 1,086
  • Helpful? 21
    • dunk's robot
Re: HELP NEEDED! microphone interfacing
« Reply #8 on: May 19, 2007, 05:41:38 AM »
Something to bear in mind is that human hearing doesn't triangulate only by the difference in volume and the difference in timing, but also by the difference in phase.
As usual, the brain is very good at mixing input from lots of different sources and giving us an idea of the direction.

High-frequency noises are the easiest to determine the direction of, as high-frequency sound does not diffract (bend) around corners or echo much.
If you hear a high-frequency noise, your brain can therefore trust the source is closest to the ear that hears it loudest.

Short sounds (for example, a gunshot) are easier to localize than constant drones, as your brain can look for the leading edge and trust the source will be closest to the ear that hears it first.

Constant, low-pitched noise is the hardest to localize, as your brain accepts it could be reflected off something or diffracted around your head so that both ears hear it equally well, and because there is little to distinguish it over time, your brain can't really tell which ear receives it first.
In cases like this your brain uses the phase difference to get the timing information. It's more processor-intensive though, hence you really have to concentrate on which direction a low-pitched noise is coming from.
As an example, next time you hear a foghorn in the mist, try working out where it's coming from. You can tell, but you have to concentrate hard.
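
The engineering version of that phase trick is the phase transform (GCC-PHAT): whiten the cross-spectrum so only phase information remains, then look for the correlation peak. A minimal numpy sketch of the standard method, not something from this thread:
Code:
import numpy as np

def gcc_phat_delay(a, b, fs):
    """Seconds by which channel b lags channel a."""
    n = len(a) + len(b)                  # zero-pad to avoid circular wrap
    A = np.fft.rfft(a, n=n)
    B = np.fft.rfft(b, n=n)
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-12               # keep the phase, drop the magnitude
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (max_shift - np.argmax(np.abs(cc))) / fs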

Although all this explains how you can tell how far off-center a noise is, none of it answers the question:
Quote
I know it's a little off topic, but how do our ears know if a sound is coming from behind or in front of you?
I remember reading somewhere that it's to do with the shape of the ears: sounds in front of you will appear crisper than sounds behind you.
I think if I had to design the hardware I'd give us 3 ears, or 4 ears if I was concerned about scary things above us.

dunk.
(who is a little deaf to midrange frequencies so has put a lot of thought into this over the years.)

Offline Rolf

  • Jr. Member
  • **
  • Posts: 11
  • Helpful? 0
Re: HELP NEEDED! microphone interfacing
« Reply #9 on: May 19, 2007, 07:27:55 AM »
I did a little experiment with two of my friends a few years ago. One person sat on a chair with his hands covering his eyes. One person stood in front of him, and the other stood behind him. We then made different short sounds, like clapping our hands and ringing a bell. We figured the person in the middle could tell where the sound was coming from about four out of five times. And he was not moving his head (the sounds were too short for that). Just wanted to tell you  :)

Offline ishka

  • Jr. Member
  • **
  • Posts: 38
  • Helpful? 0
    • Unifestival
Re: HELP NEEDED! microphone interfacing
« Reply #10 on: June 16, 2008, 03:28:37 PM »
Well, this was posted a long time ago, but as Admin referred to it today, I'm gonna give my 2 cents on human sound localization. Dunk was the closest to the truth about it, btw :D

First, a bit about the general capacity of human ears. Our ears have a special system (if interested, PM me) for recognizing the frequency of a sound. It sends the input to the brain; roughly, each frequency from 20 Hz to 20 kHz is coded by one group of neurons, so when they activate, the brain knows which frequency it is hearing.
Now, how do we know where a sound comes from?
Split the possible origins into two main categories: the ones in the vertical plane, and the ones in the horizontal plane.
First the vertical, as it is the easiest to explain: we don't know. :P The accepted theory, until a better one comes along, is that vertical sounds reflect off the external part of the ear, and the brain can notice the difference between the direct and the reflected sound. We got to this theory by noticing that people who don't have external ears (because of burns or something else) can't detect the origin of vertical sounds. You can reproduce this by putting a tube in your ear (take care not to hurt yourself).

Now the horizontal. There are (roughly) two categories of sounds: high frequencies (2 kHz to 20 kHz) and low frequencies (20 Hz to 2 kHz).
For the low-frequency ones, the brain detects the arrival-time difference between the ears, and so can tell where the sound comes from. As the wavelength of low-frequency sound is larger than the head, we have no problem localizing short low-frequency sounds this way. But what about continuous sounds? Of course you can't tell which ear hears one first, as it is continuous. Here the brain detects the interaural delay of a specific phase of the sound.
For the high-frequency ones this obviously won't work, as the wavelength is far shorter than the head. But high-frequency sounds meet the head as an obstacle and reflect off it. The result is that the intensity differs between the ears, and from that you can detect the origin of the sound. This works for both short and continuous sounds.

Maximum horizontal resolution is 2°, meaning the brain can detect an interaural delay of 0.011 ms (!!!).
The interaural delay for a sound coming from the side of the head is ~0.6 ms.
The world champion of interaural-delay analysis is the bat, which can detect delays of 0.00001 ms (no typo here: it's really 10^-8 s). Scientists wonder how they do it, as the signal sent to the brain lasts at least 1 ms.
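
A quick sanity check of that ~0.6 ms side figure, using the far-field formula delay = d * sin(angle) / c and an assumed ~0.20 m ear spacing:
Code:
import math

c = 343.0   # speed of sound, m/s
d = 0.20    # assumed ear spacing, m

side_delay = d * math.sin(math.radians(90)) / c   # sound from directly beside
print(f"{side_delay * 1e3:.2f} ms")               # 0.58 ms, close to ~0.6 ms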
May the Achtuche be with you...

 

