Wavelets - The Plain English variety

Heathen
Ok I'm a newbie here.

Now I noticed, in one of the many conversations which abound in this nirvana of good intentions :LOL:, a thing referred to as a wavelet.

Ok somebody please explain what a wavelet is, what use it is, etc. (in simple terms if you don't mind), as all I keep seeing is baby waves.
 
IIRC this has to do with the behavior of particles that have the properties of both matter and energy... namely light, in this case. There are all sorts of behaviors these particles exhibit that necessitated this branch of study, which I believe is collectively termed "wavelet theory", and in the context of this board I believe the application is the methods people have come up with, based on these theories, to simulate the behavior of light effectively.

That's what I come up with when I hear "wavelet", but I'm not a 3d guy, just a former nuclear reactor operator, so my viewpoint may be a bit skewed...also, it has been a few years since I studied that, so maybe I'm even more off. :-?

EDIT: FYI, matter and energy are "the same thing" exhibiting different behavior characteristics...it is possible to have one behaving as you'd expect the other to behave under the right extremes. My disclaimer about it being a "few years" still applies here though. :LOL:

EDIT2: An important concept I didn't get across is "statistics" or "likelihood", which was the crux of the application of this to my former profession, and likely pretty relevant to how it is applied to simulating the behavior of light, but I'm at a loss as to how to make the connection firmly with my current level of recall of the necessary terms... I guess wait for a 3d guy/gal to come along and clarify. ;)
 
Errr... wavelets have to do with frequency: high and low frequencies in signals and images. They're mainly referred to here in terms of wavelet-based compression, which splits an image into a high- and a low-frequency part, downsamples both, and stores them. IIRC the low-frequency part is then again split into high- and low-frequency components, again downsampled, and you repeat the whole thing. You kinda get a MipMap tree that splits into low- and high-frequency components, so the lowest level is 1 by 1 and represents the average of all pixels in the original image. By adding the high-frequency components back in you can recover the more detailed levels. Compression is achieved by storing the different levels with different amounts of bits.
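
If it helps, here's a toy NumPy sketch of that splitting scheme (my own reconstruction using plain Haar-style averages and differences on a power-of-two-length signal; real codecs use fancier filters):

Code:
import numpy as np

def haar_decompose(signal):
    # Recursively split into a single "average" plus per-level
    # high-frequency detail components (the "MipMap tree").
    low = np.asarray(signal, dtype=float)   # length must be a power of two
    details = []
    while len(low) > 1:
        pairs = low.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / 2.0)  # high-frequency half
        low = pairs.mean(axis=1)                           # low-frequency half
    return low, details    # low is the 1x1 level: the average of everything

def haar_reconstruct(low, details):
    # Add the detail levels back in, coarsest first, to recover the original.
    signal = low
    for diff in reversed(details):
        signal = np.column_stack([signal + diff, signal - diff]).ravel()
    return signal

x = np.array([9, 7, 3, 5, 6, 10, 2, 6], dtype=float)
low, details = haar_decompose(x)
assert np.allclose(haar_reconstruct(low, details), x)  # lossless if nothing is thrown away

Quantising or dropping the detail coefficients before storage is where the (lossy) compression comes in.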

Now err... don't ask me why this is called wavelets, though; all of this was a looong time ago for me :) (Not that I am that old)

K~
 
Hmm... I'm pretty sure we used wavelet theory to analyze, for example, the behavior of photons and their passage through a narrow aperture. Bleh, well, my "been a few years" disclaimer was there for a reason. :-?

EDIT: It strikes me we may be talking about two different applications of the same principles, as your explanation doesn't seem to exclude an application to what I'm thinking of.
 
Waveys

Well, I'm more aware of the 'traditional' view of wavelets (some interesting developments afoot with entanglement, I believe); I was just wondering how it all translates into computer graphics speak.

Well cheers all.
 
Probably no good for simple explanations, but this webpage has a fantastic array of links for wavelet theory and associated uses:

http://www.mathsoft.com/wavelets.html

Scroll down a bit for the links to do with images and computing.

The best explanation I can give (and have used in maths lessons I've given) for wavelets is that they are a group of functions (equations, formulae) that can be used to approximate another function. The wavelets have the useful feature of being able to scale the part of the original function they're looking at while retaining detail. For example, a complex but noisy sound wave can be analysed well using wavelets but not so well with other methods (e.g. Fourier analysis). As you can see in the link above, wavelet theory gets used just about everywhere.

Sorry if it's a poor explanation - my subject is physics and I only drag myself into maths if I have to. ;)
 
It's just a variation of signal analysis. If you took college calculus/numerical analysis, you are probably familiar with the power/Taylor series representation, and the Fourier transform or Laplace transform. It's similar, but with different features.

In Fourier analysis, you replace a function in the time domain f(t) with an infinite sum of harmonic functions: any reasonably well-behaved function can be represented as a sum of sine/cosine functions. The result is that you can see the frequencies that make up the original function.

For example, if I take a 5-second sample of your voice, I can decompose it into a sum-of-frequencies representation and plot it. I will see that most of the energy of your voice is concentrated in roughly the 0.3-3kHz region.

However, the Fourier transform loses the aspect of "locality" and time. It is a global analysis of the signal. I can see the frequency spectrum of your voice for the whole 5 seconds, but I don't really see what happened in detail between 1.203 seconds and 1.208 seconds.

To get around this, people tried "windowing" the Fourier transform (the short-time Fourier transform). For example, I could compute the transform between 0-1 seconds, 1-2 seconds, 2-3 seconds, 3-4 seconds, and 4-5 seconds, and then combine them. However, windowing introduces errors because the Fourier basis functions (sine and cosine) are infinite in extent.

The goal then is to come up with basis functions that evaluate close to zero outside the "window".

Wavelets are designed to solve these problems. They introduce the notion of "scale": you still transform the original signal to a frequency representation, but you can choose the scale (resolution) you do it at. The wavelet basis functions (of which there are many) are localized, so they deal much better with windowing errors.

These two features (error tolerant windowing and multiscale analysis) make wavelets great at image compression and noise reduction.
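
Here's a little NumPy sketch of that locality point (my own toy example, assuming a made-up 1 kHz sample rate): a short click is smeared across the whole Fourier spectrum, but one level of Haar-style detail coefficients pinpoints when it happened:

Code:
import numpy as np

fs = 1000                                # assumed sample rate: 1 kHz
t = np.arange(0, 5, 1 / fs)              # 5 seconds of signal
signal = np.sin(2 * np.pi * 50 * t)      # steady 50 Hz hum
signal[1205:1210] += 5.0                 # a short click at about t = 1.205 s

# Fourier view: the click's energy is smeared across all frequencies,
# and the spectrum says nothing about WHEN it happened.
spectrum = np.abs(np.fft.rfft(signal))

# Wavelet-style view (one level of Haar detail): the click shows up as a
# single large coefficient right at its position in time.
pairs = signal.reshape(-1, 2)
detail = (pairs[:, 0] - pairs[:, 1]) / 2.0
print(np.argmax(np.abs(detail)) * 2)     # prints ~1204, the click's location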
 
Thanks for that. Wavelets appear to have come into vogue after I got out of school, and I was always wondering exactly what they were, but was too lazy to look it up.

Now I can sound educated at geek parties. ;)
 
I worked with wavelets for texture compression because they delivered mipmaps for free. The trick was the filters used to create the high/low-pass effect, and indeed the concept of summing them back up to give the original image. If you don't compress the high-frequency/low-frequency images you get a perfect image back, so it's a lossless re-ordering/re-structuring of the data. By throwing bits away and encoding at a lower bitrate you got quite impressive compression rates with good image quality. The filter functions were a company secret; the rest was apparently pretty common knowledge.

I never bothered much with the maths, just cared about the end result and applying them to 3D graphics.

Heck, I forgot that there were 4 sub-bands in total; the filters work in different directions (horizontally and vertically). Have a look at this link to see the kind of filtering I was talking about (high/low frequency):

http://www.cs.colostate.edu/cameron/wavelet.html
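
A toy version of that 2D filtering (my own sketch with simple Haar-style averages/differences, not the secret filters) makes the four sub-bands and the free mipmap obvious:

Code:
import numpy as np

def haar2d_step(img):
    # One level of 2D decomposition: filter horizontally, then vertically.
    img = np.asarray(img, dtype=float)         # width/height must be even
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0   # horizontal low-pass
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0   # horizontal high-pass
    LL = (lo[0::2] + lo[1::2]) / 2.0           # average image: the next mip level
    LH = (lo[0::2] - lo[1::2]) / 2.0           # horizontal-edge detail
    HL = (hi[0::2] + hi[1::2]) / 2.0           # vertical-edge detail
    HH = (hi[0::2] - hi[1::2]) / 2.0           # diagonal detail
    return LL, LH, HL, HH

texture = np.arange(16.0).reshape(4, 4)        # dummy 4x4 "texture"
LL, LH, HL, HH = haar2d_step(texture)
print(LL.shape)                                # (2, 2): half resolution in each axis
# Recursing on LL gives the rest of the mip chain for free.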

K-
 
I doubt this will "translate" well into non-techno-speak, but here goes...

Traditionally signals are reconstructed using "Fourier" techniques, whereby any signal can be represented by a summation of an infinite number of sinusoidal waves. For transients (non-repeating signals) this is very inefficient. A wavelet is a transient function (delta functions, spikes and step functions are quite commonly used) that can be used to reconstruct other signals. Wavelets are extremely efficient for reconstructing non-repeating signals (and inefficient for reconstructing repetitive signals).

Most "signals" (this could be textures or the time-based amplitude of sound waves, etc.) are processed in "frequency space" using the traditional "Fourier" methods. In this case a function of two dimensions (time and amplitude for example) is processed more efficiently as frequency and wave vector (2d transformed to 2d for processing). One unique attribute of wavelets is that they transform into FEWER dimensions for processing (time and amplitude can transfer simply into a wavelet space of one dimension).

Blah blah. I'm boring even myself.

Mize
 
demalion said:
Hmm... I'm pretty sure we used wavelet theory to analyze, for example, the behavior of photons and their passage through a narrow aperture. Bleh, well, my "been a few years" disclaimer was there for a reason. :-?

EDIT: It strikes me we may be talking about two different applications of the same principles, as your explanation doesn't seem to exclude an application to what I'm thinking of.

I think you're confusing wave theory (from physics) with signal processing (stats and engineering) :LOL:

Cheers,
Darkman
 
Perhaps he confused "wavelets" with "wavicles."
In physics, when you talk about wavelets you could be talking about Huygens' model of light, or you could be talking about the use of wavelet theory (wavelet transforms) in the work you're examining, as it crops up a lot (e.g. quantum electrodynamics and computing). Easy to get confused if somebody just says "wavelets" at you ;)
 
1) I WAS confusing the application of wave theory with his question about wavelets, quite possibly because we did discuss wavelets in this context, but again, it has been too long. :-?

2) We did discuss wavelet theory, but I'm not sure of the context anymore. The possible places to have discussed it are rather vast, since in our job we specialized in one aspect but had to be knowledgeable about the others, which ranged from understanding how a piece of equipment maintained lubrication oil flow to how a piece of equipment used spectrography for chemical analysis (and I mean the principles on which it operated, not just the pieces that made up the apparatus), never mind nuclear theory. Several years and many tribulations later, perhaps things get a bit mixed up for me. :LOL:

Mea culpa.
 
MPEG-4 also uses wavelet compression for stills IIRC, so I think ATI's Imageon chip probably does the decompression in HW as well. Not sure if this applies to the R300 or not (probably not, because in desktop applications you generally have enough CPU horsepower).
 
MPEG-4 and JPEG2000 both use embedded wavelet coding for stills: MPEG-4's still texture coding is based on the embedded zerotree wavelet (EZW) idea, while JPEG2000 uses a related embedded scheme (EBCOT). MPEG-4 specifically requires zerotree coding for texture compression.

I haven't found anything anywhere about ATI doing hardware EZW decompression. Only 3DLabs mentions it in passing, which is how this thread started.
 
PeterT said:
The VP10 should be able to do wavelet-based texture compression AFAIR.
Wavelet-based texture (de)compression would probably have the same difficulties as, say, JPEG. In the compression phase, after you've done the transform, you'd want to throw away some parts of the information and then also probably use variable-length encoding. This makes random decompression of arbitrary sets of pixels a complete pig.
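
A rough sketch of why (hypothetical helper names, using the same kind of toy 1D Haar pyramid as earlier in the thread): even with the transform alone, fetching one texel needs a coefficient from every level, and variable-length coding means you can't jump straight to those coefficients in the stream:

Code:
import numpy as np

def haar_pyramid(signal):
    # Toy decomposition: per-level detail coefficients plus one average.
    low, details = np.asarray(signal, dtype=float), []
    while len(low) > 1:
        pairs = low.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / 2.0)
        low = pairs.mean(axis=1)
    return low, details

def lookup_sample(low, details, i):
    # Recover ONE sample: touches log2(n) coefficients, one per level.
    # With fixed-size coefficients this is fine; once they are
    # variable-length coded you can't seek straight to diff[j >> 1].
    levels = len(details)
    value = low[0]
    for k, diff in enumerate(reversed(details)):  # coarsest -> finest
        j = i >> (levels - 1 - k)                 # index at this resolution
        value = value + diff[j >> 1] if j % 2 == 0 else value - diff[j >> 1]
    return value

x = np.array([9, 7, 3, 5, 6, 10, 2, 6], dtype=float)
low, details = haar_pyramid(x)
assert lookup_sample(low, details, 5) == x[5]

Once the sub-bands are quantised and entropy coded, each of those per-level fetches turns into a sequential decode, which is where the "complete pig" part comes from.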
 