Primitives for image and video signal processing.
Accept a stream of black-and-white images as input float matrices, save the images to files, and display the resulting files as a moving video sequence. This primitive requires that programs from the Utah Raster Toolkit (URT) be on your path. Although this toolkit is not included with MLDesigner, it is freely available. The user can set the root filename of the displayed images (which will typically appear in the display window's title bar) with the ImageName parameter. If no filename is set, a default is chosen.
The Save parameter can be set to YES or NO to choose whether the created image files should be saved or deleted. Each image's frame number is appended to the root filename to form the image's complete filename.
The ByFields parameter can be set to either YES or NO to choose whether the input images should be treated as interlaced fields that make up a frame or as entire frames. If the inputs are fields, then the first field should contain frame lines 1, 3, 5, etc., and the second field should contain lines 0, 2, 4, 6, etc.
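The field convention above can be sketched as a simple weave. This is an illustrative de-interlacing helper, not the MLDesigner implementation; the function name and numpy representation are assumptions.

```python
import numpy as np

def weave_fields(field_odd, field_even):
    """Interleave two fields into one frame, following the convention
    described above: the first field holds the odd frame lines
    (1, 3, 5, ...) and the second field holds the even lines
    (0, 2, 4, ...). Illustrative sketch only."""
    rows = field_odd.shape[0] + field_even.shape[0]
    frame = np.empty((rows, field_odd.shape[1]), dtype=field_odd.dtype)
    frame[1::2] = field_odd   # odd frame lines come from the first field
    frame[0::2] = field_even  # even frame lines come from the second field
    return frame
```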
Take a float matrix representing a DCT image, insert "start of block" markers, run-length encode it, and output the modified image.
For the run-length encoding, all values with absolute value less than the Thresh parameter are set to 0.0 to improve compression. Runs are coded as a "start of run" symbol followed by an integer run length.
The HiPri parameter determines the number of DCT coefficients per block that are sent to hiport, the high-priority output. The remainder of the coefficients are sent to loport, the low-priority output.
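The thresholding, run-length coding, and priority split described above can be sketched for a single block as follows. The marker values, stream layout, and function name are assumptions for illustration; MLDesigner's actual encoding may differ.

```python
def runlength_encode_block(coeffs, thresh, hi_pri,
                           START_OF_BLOCK=-1.0, START_OF_RUN=-2.0):
    """Threshold and run-length encode one block of DCT coefficients.
    The first hi_pri coefficients go to the high-priority stream
    (hiport); the remainder go to the low-priority stream (loport).
    Marker values are illustrative assumptions, not MLDesigner's."""
    hi, lo = [START_OF_BLOCK], [START_OF_BLOCK]
    for out, part in ((hi, coeffs[:hi_pri]), (lo, coeffs[hi_pri:])):
        run = 0
        for c in part:
            if abs(c) < thresh:
                run += 1               # small values are zeroed into a run
            else:
                if run:
                    out.extend([START_OF_RUN, float(run)])
                    run = 0
                out.append(c)
        if run:                        # flush a trailing run of zeros
            out.extend([START_OF_RUN, float(run)])
    return hi, lo
```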
If the past input is not a float matrix (e.g., a dummyMessage), copy the input image unchanged to the diffOut output and send null fields (zero-size matrices) of motion vectors to the mvHorzOut and mvVertOut outputs. This should usually happen only on the first firing of the primitive.
For all other inputs, perform motion compensation and write the difference frames and motion vector frames to the corresponding outputs.
This primitive can be used as a base class to implement slightly different motion compensation algorithms. For example, synchronization techniques can be added or reduced-search motion compensation can be performed.
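One common way to realize the behavior described above is full-search block matching. The sketch below is a generic version under that assumption, not MLDesigner's source; block size, search range, and function name are illustrative.

```python
import numpy as np

def motion_compensate(current, past, block=8, search=4):
    """Full-search block-matching motion compensation (generic sketch).
    For each block of the current frame, find the best match in the
    past frame within +/- search pixels, then output the difference
    frame and the horizontal/vertical motion-vector matrices."""
    h, w = current.shape
    nby, nbx = h // block, w // block
    diff = np.empty_like(current)
    mv_h = np.zeros((nby, nbx))
    mv_v = np.zeros((nby, nbx))
    for by in range(nby):
        for bx in range(nbx):
            y, x = by * block, bx * block
            cur = current[y:y+block, x:x+block]
            best, best_dy, best_dx = None, 0, 0
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    py, px = y + dy, x + dx
                    if 0 <= py and py + block <= h and 0 <= px and px + block <= w:
                        # sum of absolute differences as the match metric
                        err = np.abs(cur - past[py:py+block, px:px+block]).sum()
                        if best is None or err < best:
                            best, best_dy, best_dx = err, dy, dx
            diff[y:y+block, x:x+block] = (
                cur - past[y+best_dy:y+best_dy+block, x+best_dx:x+best_dx+block])
            mv_v[by, bx], mv_h[by, bx] = best_dy, best_dx
    return diff, mv_h, mv_v
```

A reduced-search variant, as mentioned above, would simply restrict the set of (dy, dx) candidates examined.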
For NULL inputs (zero-size matrices) on mvHorzIn and/or mvVertIn, copy the diffIn input unchanged to the output and discard the pastIn input. (A NULL input usually indicates the first frame of a sequence.)
For non-NULL mvHorzIn and mvVertIn inputs, perform inverse motion compensation and write the result to output.
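The two cases above can be sketched as follows, assuming block-based motion vectors and treating a zero-size matrix as NULL. This is a generic reconstruction, not the MLDesigner source; the function name and argument layout are assumptions.

```python
import numpy as np

def inverse_motion_compensate(diff_in, mv_h, mv_v, past):
    """Inverse motion compensation (generic sketch). NULL motion
    vectors (zero-size matrices) mean the difference input is really
    a full frame, so it passes through unchanged and the past frame
    is discarded."""
    if mv_h.size == 0 or mv_v.size == 0:
        return diff_in.copy()          # first frame of a sequence
    h, w = diff_in.shape
    block = h // mv_h.shape[0]         # infer block size from vector grid
    out = np.empty_like(diff_in)
    for by in range(mv_h.shape[0]):
        for bx in range(mv_h.shape[1]):
            y, x = by * block, bx * block
            dy, dx = int(mv_v[by, bx]), int(mv_h[by, bx])
            # add the motion-shifted past block back onto the difference
            out[y:y+block, x:x+block] = (
                diff_in[y:y+block, x:x+block]
                + past[y+dy:y+dy+block, x+dx:x+dx+block])
    return out
```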
Accept an input gray image represented by a float matrix, median-filter the image, and send the result to the output. Filter widths of 1, 3, and 5 work well; any width greater than 5 will take a long time to run.
Median filtering is useful for removing impulse-type noise from images. It also smooths out textures, so it is a useful pre-processing step before edge detection. It removes inter-field flicker quite well when displaying single frames from a moving sequence.
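A minimal sketch of 2-D median filtering, as described above. The edge handling (clipping the window at the image boundary) is one common choice and an assumption here, as is the function name; the primitive's actual boundary behavior is not specified in this text.

```python
import numpy as np

def median_filter(image, width=3):
    """Median-filter a gray image with a width x width window
    (width odd). Windows are clipped at the image edges, which is
    one common choice (an assumption, not the documented behavior)."""
    r = width // 2
    h, w = image.shape
    out = np.empty_like(image)
    for y in range(h):
        for x in range(w):
            # window clipped to the image; each pixel becomes the
            # median of its neighborhood, which suppresses impulses
            win = image[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
            out[y, x] = np.median(win)
    return out
```

With width=1 the window is a single pixel, so the image passes through unchanged; width=3 already removes isolated impulse noise, as the description suggests.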