Is there a device like this (or is it even possible?)

minofifa

I was just thinking....

I have noticed how everybody is concerned with keeping their data in the "digital domain" and not wanting to degrade their audio by going through A/D and D/A conversions. So I was thinking: would it be possible to construct a microphone that was digital? From my understanding (which probably isn't worth much), a normal mic works by sensing the air pressure and transforming that into an electrical (analog) signal. This signal can then go through an A/D converter to become sampled as digital. Well, is there a way to make a mic that can sense air pressure and transform that into a digital signal?

Here's my analogy:

Back in the day, if we wanted our pictures on the computer, we could take a picture that was developed from film (representing the analog signal) and run it through a scanner (representing the A/D converter). Today we can buy a digital camera which makes digital pictures. The digital camera would be like the digital mic. Is something like this available or even feasible? Or is my thinking way off base?
 
There are several digital mics that I am aware of. One problem is allowing room for the A/D conversion in the mic body. Neumann has one and maybe Audix? They are frightfully expensive, which will probably change in the next few years. But there is no "purely" digital mic: no matter what you do, sound originates in the analog domain and has to be converted to digital at some point: the digital mics merely move that point closer to the capsule. The analogy of digital cameras is not entirely accurate, since they use CCLs to capture the information making up the image. In other words, the light entering from the lens falls on an array of digital light sensors and the image is assembled from them. I don't know of any equivalent "digital sound sensors" out there. The sound is still captured by an acoustic capsule, and then must be converted.
If you're interested, here's a link: http://www.national.com/nationaledge/apr03/article.html
 
In the camera analogy, I would call the lens the diaphragm and the CCD the DAC. In other words, the lens converts light into an image, and the CCD samples that image, like the diaphragm converts sound waves into an electric signal, and the DAC samples that signal.

There is no such thing as a digital signal, really. Anything digital is really just a sample of the real thing. Like in statistics, when they take a sample of what 500 people watch on TV and use that to estimate what a million people are watching. Digital technology - sound, images, video, etc. - is the same thing; it takes a small sample of the "real" thing and uses that to represent the entire thing. Just as statistics get more accurate when you take a larger sample size, so does digital. A 5-megapixel camera is more accurate than a 2-megapixel camera. A 24-bit sample is more accurate than a 16-bit sample. Taking 96,000 samples per second is more accurate than taking 44,100 samples per second.

The theory is that at some point, you have so many samples of a thing that a human being can't tell the difference between the real thing and your digital representation.
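
To put a rough number on "more accurate," here's a quick Python sketch (my own toy example, nothing official) that quantizes a 1 kHz sine at a few bit depths and measures the worst-case rounding error. Each extra bit cuts the error roughly in half:

    import math

    def quantize(x, bits):
        """Round x (in -1.0..1.0) to the nearest of 2**bits levels."""
        levels = 2 ** (bits - 1)          # signed audio: half the levels above zero
        return round(x * levels) / levels

    fs = 44100                            # samples per second
    tone = [math.sin(2 * math.pi * 1000 * n / fs) for n in range(441)]

    for bits in (8, 16, 24):
        err = max(abs(s - quantize(s, bits)) for s in tone)
        print(f"{bits:2d}-bit: worst rounding error = {err:.2e}")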
 
dirtythermos said:
In the camera analogy, I would call the lens the diaphragm and the CCD the DAC.

Cameras have diaphragms :p

Also, in the analogy, the CCD is the ADC, not the DAC.

There could be digital sensors; somebody here built a mic using LED sensors, and direct digital conversion of that output could be very similar to a CCD. I suppose there has been no compelling advantage to that approach.
 
Ahh, I see. Thanks for the info.

The theory is that at some point, you have so many samples of a thing that a human being can't tell the difference between the real thing and your digital representation.

What is that limit in digital sample terms? What are we up to right now.... 24 bit and 192 kHz or something like that? Do you think there will ever be a point where manufacturers say that's enough? There is no point in sampling any higher because human beings can't tell the difference anymore.

Also, in terms of analog, when a mic changes that air pressure (or acoustical information, as I will call it) into an electric signal, is it not sampling in a way? That transfer cannot be infinitely accurate, or else why would one mic be better than another, right?

In other words, if you could compare an analog representation of a real sound to a digital representation, what would they look like? Is analog far more accurate than digital?
 
There's a claim that the resolution of the human ear is 27 bits!
But the limit of available digital gear will depend on the availability of hard drive (or the future equivalent) space.

Also, my bad in calling CCDs CCLs. I guess my medication needs adjusting.

You're right that a diaphragm could be counted as a sampling device, but there's no associated protocol assigning a bit depth or sampling rate to it, so I guess it'll remain off the mainstream of thought.

And for what it's worth, Sweetwater will sell you a Neumann Condenser-D digital mic for $6600! You'll probably want to pick up a pair, at that price.
 
minofifa said:
What is that limit in digital sample terms? What are we up to right now.... 24 bit and 192 kHz or something like that? Do you think there will ever be a point where manufacturers say that's enough? There is no point in sampling any higher because human beings can't tell the difference anymore.

Neil Young will always hear the difference. :) As long as digital storage keeps getting cheaper and processors keep getting faster, they will probably keep raising it. More bits are helpful because they let you do more processing.
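
Here's a hedged little Python illustration of that last point (my own sketch, with an arbitrary test value): take one sample, halve and restore its level a hundred times, re-rounding to 16 or 24 bits each time the way a fixed-point mixer would, and see how much extra error the lower precision leaves behind:

    def requantize(x, bits):
        levels = 2 ** (bits - 1)
        return round(x * levels) / levels

    original = 0.3333333                  # an arbitrary sample value
    for bits in (16, 24):
        x = requantize(original, bits)
        for _ in range(100):              # gain down, then gain back up
            x = requantize(x * 0.5, bits)
            x = requantize(x * 2.0, bits)
        print(f"{bits}-bit after 100 round trips: error = {abs(x - original):.2e}")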

Also, in terms of analog, when a mic changes that air pressure (or acoustical information, as I will call it) into an electric signal, is it not sampling in a way? That transfer cannot be infinitely accurate, or else why would one mic be better than another, right?

In a way, yes. Just like if you keep enlarging a photograph from film, it will eventually get grainy. Better film can be enlarged more before getting grainy, like better components make a better mic.

But if you enlarge a digital photograph, you will very soon start to see the individual pixels. These pixels are the samples. There is nothing between them, even though in real life there was a beam of light between them.

With analog film, you will never see pixels. With "perfect" film, you could enlarge forever and never see a loss in quality. Of course, perfect film doesn't exist just like a perfect microphone doesn't exist, because of the physical limitations of the media.

In other words, if you could compare an analog representation of a real sound to a digital representation, what would they look like? Is analog far more accurate than digital?

Think about a very old, Thomas Edison era record. Sound quality sucks, right, even though it's analog. Same with cassette tape. Analog is only as accurate as the quality of the media. Digital media is "perfect," in the sense that what you put on it is exactly what you get off of it. But the quality of what you put on it is dependent on the resolution of your samples (and, in the case of audio and video, the frequency of your samples).
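
That "perfect in the copy sense" point is easy to demonstrate with Python's standard wave module. This is just a sketch with made-up sample data, but the assertion really does hold every time:

    import wave

    frames = bytes(range(256)) * 4        # stand-in for some 16-bit audio data

    with wave.open("copy.wav", "wb") as w:
        w.setnchannels(1)                 # mono
        w.setsampwidth(2)                 # 16-bit samples
        w.setframerate(44100)
        w.writeframes(frames)

    with wave.open("copy.wav", "rb") as r:
        assert r.readframes(r.getnframes()) == frames   # bit-for-bit identical
    print("read back exactly what was written")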
 
Yes, that makes a lot of sense...

I wonder why the industry didn't focus on making better media for analog, rather than pushing towards digital. I suppose digital has its advantages though, like the ability to process it more easily. Software, I would assume, has greatly enhanced the ability to manipulate waveforms. If nothing else, doing a lot of tasks is quicker.

With analog film, you will never see pixels. With "perfect" film, you could enlarge forever and never see a loss in quality. Of course, perfect film doesn't exist just like a perfect microphone doesn't exist, because of the physical limitations of the media.

Well, in the case of the camera and the physical limitations of the film, do you think that companies will ever make a camera with a high enough megapixel rating to produce an image just as good as that analog film?

In the case of audio, do you think that we will ever be able to sample deeply enough (bit-wise) and fast enough (sample-frequency-wise) to produce a recording that is equally as good as an analog signal (because there would be no way to get a better recording, due to the limitations of the analog mic)?
 
Well, not that this is reality or practical or anything, but I can imagine a digital microphone. That is, a microphone which directly converts changes in sound pressure into a quantized output. Here's one idea: use an array of MEMS pressure sensors set up such that their output is gated by a sound pressure threshold. They would be built with stepwise increases in pressure thresholds, such that at a given SPL, only those with thresholds at or below that SPL would be 'on'. The more sensors you have, the more steps you get, and you can thus set it up to correlate directly with the desired word length.

Of course, you'd need millions of them and it would be infinitely easier to just use one and convert the output.
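
For what it's worth, here's a toy Python model of that idea (purely my own sketch, not a real MEMS design, with uniform thresholds assumed). A bank of sensors with stepped thresholds is basically the "thermometer code" stage of a flash ADC: the count of tripped sensors is the quantized sample.

    import math

    N_SENSORS = 255                       # 255 thresholds -> 8-bit resolution
    thresholds = [(i + 1) / (N_SENSORS + 1) for i in range(N_SENSORS)]

    def sample_pressure(p):
        """p is a normalized sound pressure in 0.0..1.0."""
        return sum(p >= t for t in thresholds)   # how many sensors are 'on'

    # one cycle of a test tone, quantized directly by the sensor bank
    for n in range(8):
        p = 0.5 + 0.5 * math.sin(2 * math.pi * n / 8)
        print(f"pressure {p:.3f} -> code {sample_pressure(p):3d}")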
 
minofifa said:
Well, in the case of the camera and the physical limitations of the film, do you think that companies will ever make a camera with a high enough megapixel rating to produce an image just as good as that analog film?

In the case of audio, do you think that we will ever be able to sample deeply enough (bit-wise) and fast enough (sample-frequency-wise) to produce a recording that is equally as good as an analog signal (because there would be no way to get a better recording, due to the limitations of the analog mic)?

Film has a theoretical color resolution of 100 pixels per mm, which translates to about 9 megapixels on a frame of 35mm film. So in terms of matching a film camera in the consumer market, digital is already there. It gets trickier matching medium and large format, but only because large CCDs are so expensive.
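
The arithmetic behind that 9-megapixel figure, for anyone who wants to check it (assuming the standard 36 x 24 mm full frame):

    width_px  = 36 * 100                  # 36 mm frame width at 100 px/mm
    height_px = 24 * 100                  # 24 mm frame height
    print(width_px * height_px / 1e6, "megapixels")   # 8.64, i.e. roughly 9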

I also believe that digital audio can already exceed human resolution. The limit of human hearing is 20kHz, the limit of dynamic range is about 150 dB before your eardrums rupture. That translates to 25 bit/40kHz sampling, with a bit added on the sample rate to avoid ill effects of anti-aliasing filters. Of course 24 bit becomes somewhat unnecessary since most drivers can't generate SPLs anywhere near 150dB without destroying themselves. Plus most listeners don't have megawatt power amps :)
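
Those numbers come out of the usual rule of thumb that each bit is worth about 6.02 dB of dynamic range (plus a small constant term), and Nyquist's two-samples-per-cycle requirement. A quick check in Python:

    import math

    dynamic_range_db = 150                # eardrum-rupture territory
    bits = math.ceil((dynamic_range_db - 1.76) / 6.02)
    nyquist_rate = 2 * 20_000             # twice the 20 kHz limit of hearing
    print(f"{bits} bits, {nyquist_rate} Hz minimum")   # -> 25 bits, 40000 Hz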

I haven't seen the scientific basis for 24/96kHz demonstrated yet, although that doesn't mean there is none. However, reports of improved sound seem to be mainly anecdotal.
 
minofifa said:
In the case of audio, do you think that we will ever be able to sample deeply enough (bit-wise) and fast enough (sample-frequency-wise) to produce a recording that is equally as good as an analog signal (because there would be no way to get a better recording, due to the limitations of the analog mic)?

'Good' is a pretty ambiguous term, but in terms of accuracy as it pertains to frequency response and dynamic range, I believe digital is already there. Whether or not it sounds as 'good' is up to you.
 
It is true that human ears can only hear up to about 20 kHz, so by the Nyquist theorem we have to sample at at least two times that, so 40 kHz. I think the reason we sample at much higher rates, though, is to capture the high-frequency harmonics that are part of the notes we hear. Say you record a middle C on a piano: if we sampled at 40 kHz, we could only get the harmonics that reached 20 kHz before they started to fold over on themselves (alias). So if we sample at 96 kHz or higher, we can get more of the higher harmonics before they start to alias.
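
A quick Python sketch of that fold-over (my own toy example, with some arbitrary test frequencies): at a 40 kHz sample rate, anything past 20 kHz reflects back down into the audible band.

    fs = 40_000                           # sample rate in Hz

    def alias(f):
        """Frequency a pure tone at f Hz appears at after sampling at fs."""
        f = f % fs
        return fs - f if f > fs / 2 else f

    for harmonic in (15_000, 19_000, 25_000, 31_000):
        print(f"{harmonic} Hz tone shows up at {alias(harmonic)} Hz")
    # 25 kHz folds down to 15 kHz; a 96 kHz rate would capture it as-is.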

As for the bit depth, yeah, 24 bit seems to be the height of excessiveness. I don't think we will need to get much louder in decibels, or our ears will burst. Maybe for accuracy in processing alone, a higher bit depth would be better.
 
minofifa said:
Do you think there will ever be a point where manufacturers say that's enough? There is no point in sampling any higher because human beings can't tell the difference anymore.
Already people can't tell the difference, but they're still willing to pay for it. Manufacturers will say "that's enough" when people won't pay for it.
dirtythermos said:
In a way, yes. Just like if you keep enlarging a photograph from film, it will eventually get grainy. Better film can be enlarged more before getting grainy, like better components make a better mic.

But if you enlarge a digital photograph, you will very soon start to see the individual pixels. These pixels are the samples. There is nothing between them, even though in real life there was a beam of light between them.

With analog film, you will never see pixels. With "perfect" film, you could enlarge forever and never see a loss in quality. Of course, perfect film doesn't exist just like a perfect microphone doesn't exist, because of the physical limitations of the media.
Digital photography will soon be able to surpass the quality of film photography in terms of image resolution, if it hasn't already. There are also limitations imposed by lens aberrations, atmospheric turbulence (in the case of long-focal-length photography such as astrophotography), and finally the wavelength of light and the aperture of the lens, which impose a diffraction-limited resolution.
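
The back-of-envelope version of that diffraction limit, using the Rayleigh criterion (my own numbers; the wavelength and aperture here are just typical picks):

    wavelength_um = 0.55                  # green light, middle of the visible band
    f_number = 8                          # a common landscape aperture

    spot_um = 1.22 * wavelength_um * f_number   # Rayleigh criterion, in microns
    print(f"smallest resolvable spot ~ {spot_um:.1f} um")    # ~5.4 um
    # Sensor pixels much smaller than this stop adding real detail at f/8.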

What this has to do with a digital mic escapes me at the moment - I'll need to give it some thought. :D

I think digital sound manipulation has already far surpassed the resolution and fidelity of analog sound processing - it's just that some analog equipment adds a kind of distortion to the sound that many of us find pleasing. We buy different mics because of the different types of distortion they add to the sound. A digital mic, assuming a digital transducer of some type (that is, one that converts sound pressure levels directly to a digital bit stream), might not sound so pleasing to our ears.
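
If you want to see that "pleasing distortion" idea in numbers, here's a hedged Python sketch (the tanh curve and drive amount are my own stand-ins for analog saturation, not anyone's real circuit): soft-clipping a sine puts energy into odd harmonics that a clean digital path would never add.

    import math

    fs, f = 48_000, 1_000
    clean = [math.sin(2 * math.pi * f * n / fs) for n in range(fs // f)]
    warm = [math.tanh(2.5 * s) / math.tanh(2.5) for s in clean]   # drive, renormalize

    # crude harmonic check: correlate each signal against the 3rd harmonic
    def third_harmonic(sig):
        return abs(sum(s * math.sin(2 * math.pi * 3 * f * n / fs)
                       for n, s in enumerate(sig)))

    print(f"clean sine:   {third_harmonic(clean):.3f}")   # essentially zero
    print(f"soft-clipped: {third_harmonic(warm):.3f}")    # clearly nonzero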
 
The biggest problem I have with digital media (photography or audio) is longevity. I have glass plate negatives dating from the Civil War that are still usable. (And no, I'm NOT that old!) Remember the consumer digital recorders that used a cassette? Try to play those back. JPGs? What happens when the HD they're stored on goes down? And it will go down.
 
I guess I'm failing to see the point of a digital microphone.

What happens when the technology advances and goes to 64 or 128 bit, and you're still using a 24-bit mic???? You gonna call Behringer for an upgrade kit???

Besides that, with a digital mic, you're gonna be stuck using either digital media exclusively, OR having to D/A back to analog to use tape or wax cylinders or whatever.

So what's the point???

I mean, the most frequently asked question in the entire HR universe is:

What's the best mic under $100???
 
Harvey Gerst said:
The biggest problem I have with digital media (photography or audio) is longevity. I have glass plate negatives dating from the Civil War that are still usable. (And no, I'm NOT that old!) Remember the consumer digital recorders that used a cassette? Try to play those back. JPGs? What happens when the HD they're stored on goes down? And it will go down.
Well, of course you archive all your digital media in an analog format. :D
Even though it still won't last very long in the grand scheme of things - I'm happy if the stuff I make will last until I'm dead (which won't be very long now.)
 
Print out everything in binary on acid-free paper and put it in a vacuum vault. :D
 
