
Friday, January 21, 2011

What is gamma?

[Graph: Gamma, output level vs. input level]



Gamma is a numerical value that describes the relationship between the input and output signals of a camera or monitor, as illustrated in the graph above. Generally, the closer the overall gamma curve comes to a straight line (gamma = 1), the more natural the image appears to the human eye. Because a monitor’s gamma curve dips below that straight line, cameras are usually set up with an inverse gamma so that the combined camera-plus-monitor curve comes out close to a straight line. The lower part of the curve affects the dark tones, the upper part determines the tones close to white, and the middle covers everything in between.

The PMW-EX1 is preset with several gamma curves (STANDARD1-4 and CINE1-4) which can be selected to obtain the desired color balance and contrast. These presets can be modified with the Gamma function to obtain customized color reproduction. In addition, the Black Gamma function confines gamma-curve adjustments to the darker sections of the image. These gamma adjustments are an effective way to widen the unit’s expressive range.
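As a minimal illustration (this is a generic Python sketch of a power-law gamma curve, not anything from the EX1’s firmware; the 0.45 and 2.2 figures are the commonly cited camera and monitor gammas):

    import numpy as np

    def apply_gamma(signal, gamma):
        # Map normalized levels (0.0-1.0) through a power-law gamma curve.
        # gamma < 1 lifts the curve above the straight line (camera encoding);
        # gamma > 1 dips below it (typical monitor response).
        return np.clip(signal, 0.0, 1.0) ** gamma

    levels = np.linspace(0.0, 1.0, 5)
    encoded = apply_gamma(levels, 0.45)    # camera: lifts the mid-tones
    displayed = apply_gamma(encoded, 2.2)  # monitor: pulls them back down
    print(displayed)                       # close to the original levels (gamma = 1)

Since 0.45 x 2.2 is roughly 1, the two curves cancel and the end-to-end response lands close to the straight line the eye prefers.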

Knee

KNEE: A function that allows contrast to be reproduced in the bright areas of objects by compressing them to fit within the video signal's dynamic range (the range of luminance that can be processed).
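As a rough sketch of the idea (the knee point and slope below are illustrative values, not the EX1’s actual presets), a knee simply reduces the gain applied above a chosen level so that bright detail folds into the remaining headroom instead of clipping:

    import numpy as np

    def apply_knee(signal, knee_point=0.8, slope=0.2):
        # Pass levels below the knee point through unchanged; above it,
        # reduce the gain to `slope` so highlights compress rather than clip.
        signal = np.asarray(signal, dtype=float)
        return np.where(signal <= knee_point,
                        signal,
                        knee_point + (signal - knee_point) * slope)

    print(apply_knee([0.5, 0.9, 1.2, 1.5]))  # shadows untouched, highlights squeezed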

Understanding the terms shadows, midtones and highlights

It’s important to know what shadows, midtones and highlights are in photographic terms when it comes to using Photoshop. These three elements are present in every photograph, color or black and white, so it’s important to understand how to recognize and adjust them. Even if it sounds daunting, read on to find out more about these factors.

Highlights are the lightest areas of an image, the parts with the most light hitting them. If an image has too many blown highlights, we may say that it is overexposed and those areas are lacking in detail.

Midtones are the middle tones of an image, the colors in between. In a black and white image, for example, the midtones are the greys, somewhere between pure black and pure white. You want a good amount of midtone in a balanced image, but at the same time you don’t want everything to be ‘grey’ or flat.

Shadows are the darkest areas of a photograph; largely devoid of color, they might be the blacks in an image or simply the areas that carry little light. An image with too many shadows may be underexposed and will not show much detail, although this can normally be adjusted.
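If it helps to see the split in numbers, here is one simplified way to picture it (the 0.25 and 0.75 thresholds are arbitrary round figures; Photoshop’s own adjustments use smooth weighting rather than hard cutoffs):

    import numpy as np

    def tonal_ranges(luminance, shadow_max=0.25, highlight_min=0.75):
        # Split a normalized (0.0-1.0) luminance image into three masks.
        lum = np.asarray(luminance, dtype=float)
        shadows = lum < shadow_max
        highlights = lum > highlight_min
        midtones = ~shadows & ~highlights
        return shadows, midtones, highlights

    s, m, h = tonal_ranges([0.05, 0.4, 0.6, 0.9])
    print(s, m, h)  # which pixels fall in each tonal range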

Saturday, January 15, 2011

Phase of an audio signal

If you’ve paid attention for long enough, you’ve seen heated debate in online forums and letters to the editor in magazines. One side will claim that it has been proven that people can’t hear the effects of phase errors in music, and the other is just as adamant that the opposite is true.
Much of the confusion about phase lies with the fact that there are several facets to this issue. Narrow arguments on the subject can be much like the story of the blind men and the elephant—one believes that the animal is snake-like, while another insists that it’s more like a wall. Both sides may be right, as far as their knowledge allows, but both are equally wrong because they’re hampered by a limited understanding of the subject.

What is phase?

Phase shift is, in effect, a frequency-dependent time delay. If all frequencies in a sound wave (music, for instance) are delayed by the same amount as they pass through a device, we call that device “phase linear.” A digital delay has this characteristic: it simply delays the sound as a whole, without altering the relationships of frequencies to each other. The human ear is insensitive to this kind of phase change, as long as the delay is constant and we don’t have another signal to reference it to. The audio from a CD player is always delayed due to processing, for instance, but it has no effect on our listening enjoyment.
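As a quick numeric sketch of that point (the sample rate, tone frequencies, and delay length here are arbitrary choices), delaying a whole signal shifts every frequency by the same amount of time, so the relationships between its components, and hence the waveform shape, are untouched:

    import numpy as np

    fs = 48000                       # sample rate (Hz)
    t = np.arange(0, 0.01, 1 / fs)
    x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

    delay = 96                       # a 2 ms "phase linear" delay
    y = np.concatenate([np.zeros(delay), x])[:len(x)]

    # Both components are delayed by the same 2 ms, so the waveform
    # shape is intact; only its position in time has moved.
    print(np.allclose(x[:-delay], y[delay:]))  # True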

Relative phase

Now, even if the phase is linear (simply an overall delay), we can easily detect a phase difference if we have a reference. For instance, you can get closer to one of your stereo speakers than the other; even if you use the stereo balance control to even out the relative loudness between the speakers, it won’t sound the same as sitting equidistant between them.
Another obvious case is when we have a direct reference to compare to. When you delay music and mix it with the un-delayed version, for instance, it’s easy to hear the effect; short delays cause frequency-dependent cancellation between the two signals, while longer delays result in an obvious echo.
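This comb-filter effect is easy to verify numerically. In the sketch below (delay and test frequencies picked for convenience), a tone that arrives half a cycle late cancels, while one that arrives a full cycle late reinforces:

    import numpy as np

    fs = 48000
    delay = 48                           # 1 ms at 48 kHz
    t = np.arange(0, 0.1, 1 / fs)

    for f in (250, 500, 1000):           # test tones (Hz)
        x = np.sin(2 * np.pi * f * t)
        delayed = np.concatenate([np.zeros(delay), x])[:len(x)]
        mixed = x + delayed
        # 500 Hz arrives half a cycle late and cancels (~0.0);
        # 1000 Hz arrives a full cycle late and doubles (~2.0).
        print(f, round(np.max(np.abs(mixed[delay:])), 2))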
If you connect one of your stereo speakers up backwards, inverting the signal, you’ll get phase cancellation between many harmonic components simultaneously as they cancel in the air. This is particularly noticeable with mono input and at low frequencies, where the distance between the speakers has less effect.

The general case

Having dispensed with linear phase, let’s look at the more general case of phase as a frequency-dependent delay.
Does it seem likely that we could hear the difference between a music signal and the same signal with altered phase?
First, I should point out that phase error, in the real world, is typically constant and affects a group of frequencies, usually by progressive amounts. By “constant”, I mean that the phase error is not moving around, as in the effect a phase-shifter device is designed to produce. By “group of frequencies”, I mean that it’s typically not a single frequency that’s shifted, nor a set of unrelated frequencies; phase shift typically “smears” an area of the music spectrum.
Back to the question: Does it seem likely that we could hear the difference between an audio signal and the same signal with altered phase? The answer is… No… and ultimately Yes.
No: The human ear is insensitive to a constant relative phase change in a static waveform. For instance, you cannot hear the difference between a steady sawtooth wave (which contains all harmonic frequencies) and a waveform that contains the same harmonic content but with the phases of the harmonics shifted by various (but constant) amounts. The second waveform would not look like a sawtooth on an oscilloscope, but you would not be able to hear the difference. And this is true no matter how ridiculous you get with the phase shifting.
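You can construct both waveforms yourself. The sketch below (fundamental, harmonic count, and random seed are all arbitrary choices) sums the same set of harmonic amplitudes twice: once with the sawtooth’s natural phases, and once with a random but constant phase offset per harmonic:

    import numpy as np

    fs = 48000
    f0 = 220                             # fundamental (Hz)
    t = np.arange(0, 0.5, 1 / fs)
    rng = np.random.default_rng(0)

    saw = np.zeros_like(t)
    scrambled = np.zeros_like(t)
    for n in range(1, 40):               # sawtooth harmonics: amplitude 1/n
        saw += np.sin(2 * np.pi * n * f0 * t) / n
        # identical amplitude, but a random constant phase per harmonic
        scrambled += np.sin(2 * np.pi * n * f0 * t + rng.uniform(0, 2 * np.pi)) / n

    # The two waveforms look nothing alike on a scope...
    print(round(np.max(np.abs(saw - scrambled)), 2))
    # ...yet played back as steady tones they sound the same.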
Yes: Dynamically changing waveforms are a different matter. In particular, it’s not only reasonable, but easy to demonstrate (at least under artificially produced conditions) that musical transients (pluck, ding, tap) can be severely damaged by phase shift. Many frequencies of short duration combine to produce a transient, and phase shift smears their time relationship, turning a “tock!” into a “thwock!”.
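One artificial way to demonstrate this, sketched below with SciPy (the pole value and stage count are chosen simply to exaggerate the effect), is to run an idealized click through a cascade of all-pass filters, which pass every frequency at full amplitude but delay each one by a different amount:

    import numpy as np
    from scipy.signal import lfilter

    click = np.zeros(1024)
    click[100] = 1.0                     # an idealized transient ("tock!")

    a = 0.6                              # all-pass pole; magnitude response is flat
    smeared = click.copy()
    for _ in range(50):                  # cascade stages to pile up phase shift
        smeared = lfilter([-a, 1.0], [1.0, -a], smeared)

    # The energy is unchanged but spread out in time: the peak drops
    # and the click is smeared into something closer to "thwock!".
    print(round(np.max(np.abs(click)), 2), round(np.max(np.abs(smeared)), 2))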
Because music is a dynamic waveform, the answer has to be “yes”: phase shift can indeed affect the sound. The second question is “how much?”, and that is certainly a tougher one. It depends on the degree of phase error, the area of the spectrum it occupies, and the music itself. Clearly we can tolerate phase shift to a degree; all forms of analog equalization, such as the EQ on mixing consoles, impart significant phase shift. It’s probably wise, though, to minimize phase shift where we can.