Watermarking Motion Data
Motion data, the time series of positions and postures of a human body, is widely used as an essential component in many fields, such as content authoring in computer animation and motion analysis in biomechanics. Thanks to rapid progress in motion capture (MoCap) technology, it is now relatively easy to obtain such data by recording the actions of a real actor.
One of the practical issues with motion acquisition by MoCap systems is that it requires expensive hardware and software, which must be set up in a large studio. It is therefore highly desirable that captured or synthesized motion clips be available at a reasonable cost. However, most owners of captured motion content are reluctant to publish their data, mainly because of piracy-related risks.
We propose an imperceptible, robust, informed watermarking method designed to prove ownership claims on motion data. Watermarking is a technique that slightly modifies an original document to hide a secret message related to its content within its structure. Quite a few watermarking methods have been proposed for media formats related to computer graphics, such as sound, images, vector drawings, video, and 3D shapes. We extend the scope of these techniques to motion data. Although we have applied our method only to human motion data, it can easily be generalized to a broader scope.
A watermark is imperceptible if it is difficult to determine whether a watermark has been inserted into given motion data. We use the term robust watermarking to mean that the watermark is designed to survive an adversary's removal attempts. Informed watermarking refers to the fact that the system requires the original data in order to detect the watermark embedded in it.
Hiding Copyright Information
We employ the spread spectrum technique and extend it to motion watermarking. The original data is first transformed into the spectral domain. A set of spectral coefficients is then slightly modulated according to a watermark and a chip rate. The partially modified data is transformed back to the original domain, which yields the watermarked data. The watermarked data can be distributed, while the original data and the watermark must be kept secret in a safe place.
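The embedding step can be sketched as follows. This is a minimal illustration of spread-spectrum modulation on a single motion channel (e.g., one joint angle over time), using an FFT as the spectral transform; the parameter names (`alpha`, `chip_rate`) and the pseudo-noise carrier are illustrative assumptions, not the actual implementation described in the paper:

```python
import numpy as np

def embed(signal, watermark_bits, alpha=0.01, chip_rate=4, seed=0):
    """Spread-spectrum embedding sketch: each watermark bit is spread
    over `chip_rate` spectral coefficients via a +/-1 chip sequence
    multiplied by a secret pseudo-noise carrier (seeded by `seed`)."""
    rng = np.random.default_rng(seed)
    coeffs = np.fft.rfft(signal)                     # to the spectral domain
    # map bits {0,1} -> chips {-1,+1}, repeated chip_rate times each
    chips = np.repeat(np.where(np.asarray(watermark_bits) > 0, 1.0, -1.0),
                      chip_rate)
    carrier = rng.choice([-1.0, 1.0], size=chips.size)
    # slightly modulate a band of coefficients (skip the DC term)
    coeffs[1:1 + chips.size] *= 1.0 + alpha * chips * carrier
    return np.fft.irfft(coeffs, n=signal.size)       # back to the time domain
```

Because the modulation scales each coefficient by at most a factor of `1 + alpha`, the watermarked signal stays close to the original, which is what makes the mark imperceptible.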
The proposed algorithm has two features required for character animation synthesis in computer graphics applications. First, it guarantees that watermarking does not alter original constraints such as kinematics and floor contacts (see the figure on the right). Second, the watermark is robust against various types of attacks, such as cropping, transplanting, free-form deformation, and similarity transformation, as well as standard signal processing.
Detecting Copyright Information
To detect the watermark, data suspected of illegal use is demodulated to extract a candidate watermark. The presence of the watermark is then claimed based on the statistical correlation between the embedded and extracted watermarks.
Given suspected motion data, a possibly embedded watermark is extracted through the inverse of the embedding operation. Before extraction, the two motion data sets, the original and the suspected one, must be aligned in space. We propose a novel method for estimating the similarity and projective transformation between the motion data for each joint of the articulated figure. Experimental results on various motion capture data demonstrate that the proposed method is robust against various types of attacks, while the fidelity and quality of the watermarked motion data remain acceptable for the purposes for which they were captured.
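The informed detection step can be sketched as follows. This toy example pairs a minimal spread-spectrum embedder with a detector that demodulates the suspect signal against the original and reports the correlation between the embedded and extracted chip sequences; all parameter names are illustrative assumptions, and the spatial alignment step described above is omitted (the signals are assumed already aligned):

```python
import numpy as np

def embed(signal, bits, alpha=0.01, chip_rate=4, seed=0):
    """Toy embedder matching the detector below: modulate a band of
    spectral coefficients with a seeded pseudo-noise chip carrier."""
    rng = np.random.default_rng(seed)
    coeffs = np.fft.rfft(signal)
    chips = np.repeat(np.where(np.asarray(bits) > 0, 1.0, -1.0), chip_rate)
    carrier = rng.choice([-1.0, 1.0], size=chips.size)
    coeffs[1:1 + chips.size] *= 1.0 + alpha * chips * carrier
    return np.fft.irfft(coeffs, n=signal.size)

def detect(suspect, original, claimed_bits, alpha=0.01, chip_rate=4, seed=0):
    """Informed detection: requires the original data. Returns the
    recovered bits and the correlation statistic used for the claim."""
    rng = np.random.default_rng(seed)
    chips = np.repeat(np.where(np.asarray(claimed_bits) > 0, 1.0, -1.0),
                      chip_rate)
    carrier = rng.choice([-1.0, 1.0], size=chips.size)
    band_s = np.fft.rfft(suspect)[1:1 + chips.size]
    band_o = np.fft.rfft(original)[1:1 + chips.size]
    # demodulate: the coefficient ratio recovers the modulation term
    est = ((band_s / band_o).real - 1.0) * carrier / alpha
    # majority vote over each bit's chips
    recovered = (est.reshape(-1, chip_rate).sum(axis=1) > 0).astype(int)
    # normalized correlation between embedded and extracted chip sequences
    corr = float(np.corrcoef(est, chips)[0, 1]) if est.std() > 0 else 0.0
    return recovered, corr
```

A high correlation value supports the ownership claim; for unwatermarked or differently watermarked data the demodulated chips are essentially uncorrelated with the claimed ones.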
- Shuntaro Yamazaki, “Watermarking Motion Data”, In Proc. Pacific Rim Workshop on Digital Steganography (STEG04), pp.177-185, Nov 2004
- News article, Nikkei Sangyo Shimbun, Apr 5, 2005 (in Japanese)