Package net.paulhertz.pixelaudio
Class WaveSynth
java.lang.Object
net.paulhertz.pixelaudio.WaveSynth
Implements a combination of color organ and additive audio synth.
Animates pixels using phase shifting of audio generators in
waveDataList.
WaveSynth is organized around properties, such as gain (i.e., loudness or brightness)
and gamma (a contrast-like setting), and around data objects. The data objects include
a bitmap, mapImage, a Processing PImage instance that provides the image representation
of the WaveSynth; a PixelAudioMapper that allows the WaveSynth to mediate between audio
data and image data; arrays for the WaveSynth's audio signal and for the image data ordered
along the PixelAudioMapper signal path; and an array of WaveData objects, waveDataList,
that holds the individual sine wave components of the WaveSynth with their frequency,
amplitude, phase, and other properties. There is also a series of properties concerned
with animation and video output.
When a WaveSynth is used to produce color patterns, each WaveData object in the waveDataList
controls a color. The colors of the various WaveData objects are added together, much
as sine waves are summed to produce audio, with the brightness of each color determined
by the amplitude of the controlling sine wave. The WaveSynthAnimation example code
provides a graphical user interface for editing the WaveSynth properties and the individual
WaveData objects. Experiment with it to get an idea of the patterns WaveSynth can produce.
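A minimal usage sketch (not part of the library documentation) may help orient readers. It assumes a PixelAudioMapper has already been constructed (see the PixelAudioMapper class for details), calls quickWaveDataList() in an assumed no-argument form to obtain a starter list of WaveData operators, and otherwise uses only methods and fields documented below; the numeric settings are illustrative values, not recommendations.

    import processing.core.PApplet;
    import net.paulhertz.pixelaudio.*;

    public class WaveSynthDemo extends PApplet {
      PixelAudioMapper mapper;    // assumed: constructed elsewhere (see PixelAudioMapper docs)
      WaveSynth wavesynth;

      public void settings() {
        size(512, 512);
      }

      public void setup() {
        // mapper = ... ;                       // build a PixelAudioMapper for a 512 x 512 image here
        wavesynth = new WaveSynth(mapper);      // documented constructor
        wavesynth.setWaveDataList(wavesynth.quickWaveDataList()); // starter WaveData operators
        wavesynth.setGain(1.0f);                // overall brightness / loudness
        wavesynth.setGamma(1.4f);               // contrast-like adjustment
        wavesynth.setAnimSteps(240);            // length of one animation cycle, in frames
        wavesynth.prepareAnimation();
      }

      public void draw() {
        int frame = frameCount % wavesynth.getAnimSteps();
        wavesynth.renderFrame(frame);           // writes the current frame to wavesynth.mapImage
        image(wavesynth.mapImage, 0, 0);
        // float[] sig = wavesynth.renderAudio(frame);  // audio signal for the same frame
      }
    }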
Field Summary
Fields
Modifier and Type           Field            Description
int                         animSteps
float[]                     audioSignal
int[]                       colorSignal
String                      comments         comments for JSON file
int                         dataLength
float                       gain
float                       gamma
int[]                       gammaTable
private int                 h
int                         histoHigh
int                         histoLow
(package private) boolean   isRenderAudio
boolean                     isScaleHisto
processing.core.PImage      mapImage
float                       mapInc           the increment in phase over the image pixels, typically TWO_PI / image size
PixelAudioMapper            mapper
int                         mapSize
int[]                       maskScan         array of color values for mask, especially useful when it is constant
float                       noisiness
float[]                     renderSignal
float                       sampleRate       The sampling frequency, the number of samples read in one second of sound.
int                         step
int                         stop
boolean                     useGammaTable
String                      videoFilename
int                         videoFrameRate
private int                 w
int[]                       waveColors       array of colors associated with the WaveData operators
ArrayList<WaveData>         waveDataList
float[]                     weights          array of amplitudes associated with the WaveData operators
float                       woff             offset for normalizing signal, see renderFrame method
float                       wscale           scaling factor for normalizing signal, see renderFrame method
-
Constructor Summary
Constructors
Constructor                                                     Description
WaveSynth(PixelAudioMapper mapper)
WaveSynth(PixelAudioMapper mapper, ArrayList<WaveData> wdList)
-
Method Summary
Methods
Modifier and Type        Method                                                        Description
                         clone()
int                      getAnimSteps()
String                   getComments()
float                    getGain()
float                    getGamma()
int                      getHeight()
static int[]             getHistoBounds(int[] source)
int                      getHistoHigh()
int                      getHistoLow()
PixelAudioMapper         getMapper()
float                    getNoiseiness()
float                    getSampleRate()
int                      getStep()
int                      getStop()
String                   getVideoFilename()
int                      getVideoFrameRate()
ArrayList<WaveData>      getWaveDataList()
int                      getWidth()
boolean                  isRenderAudio()
boolean                  isScaleHisto()
float                    noiseAt(int x, int y)
static float[]           normalize(float[] sig)
static float[]           normalize(float[] sig, float limit)
void                     prepareAnimation()
ArrayList<WaveData>      quickWaveDataList()                                           Initializes a list of WaveData for use by a WaveSynth.
float[]                  renderAudio(int frame)
float[]                  renderAudio(int frame, float limit)
float[]                  renderAudioRaw(int frame)
void                     renderFrame(int frame)
int                      renderPixel(int frame, int pos, ArrayList<WaveData> wdList)   Render one pixel, return its RGB value.
void                     setAnimSteps(int animSteps)
void                     setComments(String comments)
void                     setGain(float gain)
void                     setGamma(float gamma)
void                     setHistoHigh(int histoHigh)
void                     setHistoLow(int histoLow)
void                     setMapper(PixelAudioMapper mapper)
void                     setNoiseiness(float noiseiness)
void                     setRenderAudio(boolean isRenderAudio)
void                     setSampleRate(float newSampleRate)
void                     setScaleHisto(boolean isScaleHisto)
void                     setStep(int step)
void                     setStop(int stop)
void                     setVideoFilename(String videoFilename)
void                     setVideoFrameRate(int videoFrameRate)
void                     setWaveData(ArrayList<WaveData> wdList)
void                     setWaveDataList(ArrayList<WaveData> waveDataList)
static int[]             stretch(int[] source, int low, int high)
String                   toString()
void                     updateWaveColors()
int                      weightedColor(int[] colors, float[] weights)
-
Field Details
-
mapper
-
mapImage
public processing.core.PImage mapImage -
colorSignal
public int[] colorSignal -
audioSignal
public float[] audioSignal -
renderSignal
public float[] renderSignal -
waveDataList
-
isRenderAudio
boolean isRenderAudio -
w
private int w -
h
private int h -
mapSize
public int mapSize -
dataLength
public int dataLength -
gain
public float gain -
gamma
public float gamma -
gammaTable
public int[] gammaTable -
useGammaTable
public boolean useGammaTable -
isScaleHisto
public boolean isScaleHisto -
histoLow
public int histoLow -
histoHigh
public int histoHigh -
animSteps
public int animSteps -
step
public int step -
stop
public int stop -
noisiness
public float noisiness -
comments
comments for JSON file -
sampleRate
public float sampleRate
The sampling frequency, the number of samples read in one second of sound. By default, for WaveSynth instances that are intended to be primarily visual, mapSize is the sampling frequency. This makes one period of a 1.0 Hz wave fill the entire signal curve. On the other hand, if we want the image to represent an audio signal that is also produced by additive synthesis, we should set sampleRate to a standard value such as 44100 or 48000. -
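A brief illustration of the two cases just described (the wavesynth variable is an assumed WaveSynth instance, not library code):

    // Audio-oriented WaveSynth: use a standard audio sampling rate.
    wavesynth.setSampleRate(48000);
    // Visual-oriented WaveSynth: the default, where sampleRate equals mapSize,
    // makes one period of a 1.0 Hz wave span the entire signal path.
    wavesynth.setSampleRate(wavesynth.mapSize);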
mapInc
public float mapInc
The increment in phase over the image pixels, typically TWO_PI / image size. -
weights
public float[] weights
Array of amplitudes associated with the WaveData operators. -
waveColors
public int[] waveColors
Array of colors associated with the WaveData operators. -
maskScan
public int[] maskScan
Array of color values for the mask, especially useful when it is constant. -
woff
public float woff
Offset for normalizing the signal; see the renderFrame method. -
wscale
public float wscale
Scaling factor for normalizing the signal; see the renderFrame method. -
videoFrameRate
public int videoFrameRate -
videoFilename
-
-
Constructor Details
-
WaveSynth
public WaveSynth(PixelAudioMapper mapper) -
WaveSynth
public WaveSynth(PixelAudioMapper mapper, ArrayList<WaveData> wdList) -
-
Method Details
-
setMapper
public void setMapper(PixelAudioMapper mapper) -
setWaveData
public void setWaveData(ArrayList<WaveData> wdList) -
updateWaveColors
public void updateWaveColors() -
quickWaveDataList
public ArrayList<WaveData> quickWaveDataList()
Initializes a list of WaveData for use by a WaveSynth.
Returns:
an ArrayList of WaveData objects
-
getWaveDataList
public ArrayList<WaveData> getWaveDataList() -
setWaveDataList
public void setWaveDataList(ArrayList<WaveData> waveDataList) -
getGain
public float getGain() -
setGain
public void setGain(float gain) -
getGamma
public float getGamma() -
setGamma
public void setGamma(float gamma) -
isScaleHisto
public boolean isScaleHisto() -
setScaleHisto
public void setScaleHisto(boolean isScaleHisto) -
getHistoLow
public int getHistoLow() -
setHistoLow
public void setHistoLow(int histoLow) -
getHistoHigh
public int getHistoHigh() -
setHistoHigh
public void setHistoHigh(int histoHigh) -
getNoiseiness
public float getNoiseiness() -
setNoiseiness
public void setNoiseiness(float noiseiness) -
getAnimSteps
public int getAnimSteps() -
setAnimSteps
public void setAnimSteps(int animSteps) -
getStop
public int getStop() -
setStop
public void setStop(int stop) -
getStep
public int getStep() -
setStep
public void setStep(int step) -
getComments
public String getComments() -
setComments
public void setComments(String comments) -
getVideoFrameRate
public int getVideoFrameRate() -
setVideoFrameRate
public void setVideoFrameRate(int videoFrameRate) -
getVideoFilename
public String getVideoFilename() -
setVideoFilename
public void setVideoFilename(String videoFilename) -
getMapper
public PixelAudioMapper getMapper() -
getWidth
public int getWidth() -
getHeight
public int getHeight() -
getSampleRate
public float getSampleRate() -
setSampleRate
public void setSampleRate(float newSampleRate) -
clone
-
toString
public String toString() -
isRenderAudio
public boolean isRenderAudio() -
setRenderAudio
public void setRenderAudio(boolean isRenderAudio) -
prepareAnimation
public void prepareAnimation() -
renderFrame
public void renderFrame(int frame) -
renderPixel
public int renderPixel(int frame, int pos, ArrayList<WaveData> wdList)
Render one pixel, return its RGB value.
NOTES
Our basic equation:
    sample amplitude = sin(initial phase + phase shift + frequency * i * (TWO_PI / n))
Restated, in two parts:
    wd.phaseInc = (wd.cycles * TWO_PI) / animSteps;
    mapInc = TWO_PI / mapSize;
    float val = (float) (Math.sin(wd.phaseTwoPi - frame * wd.phaseInc + wd.freq * freqShift * pos * mapInc) + woff) * wscale + wd.dc;
Instead of incrementing the phase at each step, we subtract (frame * phase increment) from the initial phase. We subtract rather than add so that animation data files give the same result as in previous implementations. (And yes, I have forgotten the original reasons for subtracting.) In the latest version we let the WaveData object calculate the signal: this is much more flexible and barely affects the time.
Parameters:
frame -
pos -
wdList -
Returns:
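As a rough sketch only (this is not the library's source), the additive logic just described could be written as follows. It uses the documented fields mapInc, woff, wscale, and waveColors, the documented method weightedColor(), and the per-operator values named in the equation above (wd.phaseTwoPi, wd.phaseInc, wd.freq, wd.dc, and the freqShift factor); it assumes java.util.ArrayList is imported and that the WaveData fields are accessible.

    // Hypothetical sketch of the per-pixel additive step; not the actual implementation.
    int sketchRenderPixel(int frame, int pos, ArrayList<WaveData> wdList, float freqShift) {
        float[] wts = new float[wdList.size()];
        int[] cols = new int[wdList.size()];
        for (int j = 0; j < wdList.size(); j++) {
            WaveData wd = wdList.get(j);
            // evaluate this operator's sine wave at the current frame and signal position
            wts[j] = (float) (Math.sin(wd.phaseTwoPi - frame * wd.phaseInc
                     + wd.freq * freqShift * pos * mapInc) + woff) * wscale + wd.dc;
            cols[j] = waveColors[j];      // color associated with this operator
        }
        return weightedColor(cols, wts);  // mix the operator colors by their weights
    }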
-
weightedColor
public int weightedColor(int[] colors, float[] weights) -
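The mixing itself is not described here; one plausible (hypothetical) reading, consistent with the class description, scales each color's RGB channels by its weight, sums the results, and clamps them to the 0..255 range:

    // Hypothetical illustration of weighted color mixing; not the library's code.
    static int mixColorsByWeight(int[] colors, float[] weights) {
        float r = 0, g = 0, b = 0;
        for (int i = 0; i < colors.length; i++) {
            r += ((colors[i] >> 16) & 0xFF) * weights[i];
            g += ((colors[i] >> 8) & 0xFF) * weights[i];
            b += (colors[i] & 0xFF) * weights[i];
        }
        int rr = Math.max(0, Math.min(255, Math.round(r)));
        int gg = Math.max(0, Math.min(255, Math.round(g)));
        int bb = Math.max(0, Math.min(255, Math.round(b)));
        return 0xFF000000 | (rr << 16) | (gg << 8) | bb;
    }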
renderAudio
public float[] renderAudio(int frame) -
renderAudioRaw
public float[] renderAudioRaw(int frame) -
renderAudio
public float[] renderAudio(int frame, float limit) -
noiseAt
public float noiseAt(int x, int y) -
normalize
public static float[] normalize(float[] sig, float limit) -
normalize
public static float[] normalize(float[] sig) -
getHistoBounds
public static int[] getHistoBounds(int[] source) -
stretch
public static int[] stretch(int[] source, int low, int high)
-