As new technologies and innovations emerge, it has become hard for traditional techniques to keep pace. The same is true here: 3D animation has long been an integral part of the film industry, while motion capture is newer and here to stay. Motion capture, a favourite of live-action movie directors, is gaining attention in the film industry.
In producing entire feature films with computer animation, the industry is currently split between studios that use motion capture and studios that do not. Of the three nominees for the 2006 Academy Award for Best Animated Feature, two (“Monster House” and the winner, “Happy Feet”) used motion capture, and only Pixar’s “Cars” was animated without it. In the ending credits of Pixar’s latest film, “Ratatouille,” a stamp appears labelling the film as “100% Pure Animation — No Motion Capture!”
For 3D animation, objects are built on the computer and 3D figures are rigged with a virtual skeleton. The animator then moves the figure's limbs, eyes, mouth, clothes, etc. on key frames, and the computer automatically calculates the differences in appearance between key frames. To gain more control over this interpolation, most 3D animation packages provide a parameter curve editor, which shows a graphical representation of a parameter's value over time (the animation curve). Altering the shape of the curve changes the interpolation and therefore the speed of motion. By changing the interpolation it is possible to avoid surface interpenetration (such as fingers intersecting each other) when transitioning from one hand shape to the next. The realism of keyframe animation depends largely on the animator's ability to set believable keyframes (e.g. realistic hand shapes) and to control the interpolation between them, i.e. the speed and fluidity of motion. Finally, the animation is rendered.
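The interpolation and curve-editing idea above can be sketched in a few lines. This is a minimal illustration, not code from any real animation package: keyframes are (frame, value) pairs, and swapping the linear curve for an ease-in/out curve changes the speed of motion without touching the keys themselves.

```python
# Minimal keyframe interpolation sketch (all names and values invented).

def lerp(a, b, t):
    """Linear interpolation between key values a and b, t in [0, 1]."""
    return a + (b - a) * t

def ease_in_out(t):
    """Smoothstep curve: reshaping t changes the speed of motion
    (slow start, slow end) without changing the keyframe values."""
    return t * t * (3 - 2 * t)

def sample(key_frames, frame, curve=lambda t: t):
    """key_frames: sorted list of (frame, value) pairs.
    Returns the interpolated value at 'frame'."""
    for (f0, v0), (f1, v1) in zip(key_frames, key_frames[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return lerp(v0, v1, curve(t))
    return key_frames[-1][1]          # hold the last key after the end

keys = [(0, 0.0), (24, 10.0)]         # value moves 0 -> 10 over 24 frames
print(sample(keys, 12))               # linear: halfway -> 5.0
print(sample(keys, 6, ease_in_out))   # eased: a slower start than linear
```

Editing the animation curve in a parameter curve editor corresponds to replacing the `curve` function here: same keys, different timing.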
History of 3D animation
In 1824 Peter Roget presented his paper 'The persistence of vision with regard to moving objects' to the British Society. In 1831 Dr. Joseph Antoine Plateau (a Belgian scientist) and Dr. Simon Rittrer constructed a machine called a phenakistoscope. This machine produced an illusion of movement by allowing a viewer to gaze at a rotating disk containing small windows; behind the windows was another disk containing a sequence of images. When the disks were rotated at the correct speed, the synchronization of the windows with the images created an animated effect. Eadweard Muybridge started his photographic studies of animals in motion. The zoetrope (a series of sequential images in a revolving drum) creates the illusion of motion when the drum is revolved and the images are viewed through its slits; film creates the same illusion by showing one image, then black, then the next image, then black again. The thaumatrope is a two-frame animation: twirl it and its two images superimpose on each other.
In 1887 Thomas Edison started his research into motion pictures. He announced his creation of the kinetoscope, which projected a 50 ft length of film in approximately 13 seconds. In 1892 Emile Reynaud, combining his earlier invention of the praxinoscope with a projector, opened the Theatre Optique in the Musee Grevin, displaying animations of images painted on long strips of celluloid. Louis and Auguste Lumiere patented a device called the Cinematographe, capable of projecting moving pictures. Thomas Armat designed the vitascope, which projected the films of Thomas Edison; this machine had a major influence on all subsequent projectors. J. Stuart Blackton made the first animated film, which he called "Humorous Phases of Funny Faces," in 1906. His method was to draw comical faces on a blackboard and film them; he would stop the film, erase one face to draw another, and then film the newly drawn face. The stop motion produced a startling effect as the facial expressions changed before the viewer's eyes. Emile Cohl made "En Route," the first cut-out animation; this technique saves time because each new cel need not be redrawn, only the paper repositioned. Winsor McCay produced an animation sequence using his comic strip character "Little Nemo." John R. Bray applied for patents on numerous animation techniques, one of the most revolutionary being the process of printing the backgrounds of the animation. In 1914 Winsor McCay produced a cartoon called "Gertie the Trained Dinosaur," which amazingly consisted of 10,000 drawings.
In 1914 Earl Hurd applied for a patent on the technique of drawing the animated portion of an animation on a clear celluloid sheet and photographing it over its matching background (cel animation).
Cel and Paper Animation Technique:
By the mid-1910s animation production in the US was already dominated by the cel and paper techniques. Cel animation became more popular in America than in Europe because of the assembly-line Taylorism that had taken America by storm. Cel animation was well suited to the assembly-line style of manufacturing because it required a whole line of people working on very specific, simple, repetitive duties. In Europe, by contrast, where the assembly-line style of work was not encouraged, clay animation and other forms of animation that required only a few individuals working on a set at a time were more popular, because the actual set could only accommodate a limited number of people working together at once. In Disney cel animation, each image was drawn one at a time using the onion-skinning technique.
In traditional cel animation, drawings are created one by one: animators create the keyframes and assistants create the in-betweens, with onion skinning used to make it easier to reference each previous drawing.
The International Feature Syndicate released many titles including "Silk Hat Harry," "Bringing Up Father" and "Krazy Kat." In 1923 the first feature-length animated film, "El Apostol," was created in Argentina. The same year saw the founding of the Disney Brothers Cartoon Studio by Walt and Roy Disney. Walt Disney extended Max Fleischer's technique of combining live action with cartoon characters in the film "Alice's Wonderland." Warner Brothers released "The Jazz Singer," which introduced combined sound and images. Ken Knowlton, working at Bell Laboratories, started developing computer techniques for producing animated movies. At the University of Utah, Ed Catmull developed an animation scripting language and created an animation of a smooth shaded hand (Ref: E. Catmull, "A System for Computer Generated Movies," Proceedings of the ACM National Conference, 1972). Beier and Neely, at SGI and PDI respectively, published an algorithm in which line correspondences guide morphing between 2D images; the demo was Michael Jackson's video "Black or White" (Ref: T. Beier and S. Neely, "Feature-Based Image Metamorphosis," Computer Graphics, July 1992).
Chen and Williams at Apple published a paper on view interpolation for 3D walkthroughs (Ref: S. E. Chen and L. Williams, "View Interpolation for Image Synthesis," Computer Graphics Proceedings, Annual Conference Series, 1993). "Jurassic Park" used CG for realistic living creatures: the stars of this movie, directed by Steven Spielberg, were the realistic-looking and realistically moving 3D dinosaurs created by Industrial Light and Magic. Each new generation of computer graphics brings new and more believable CGI characters, such as those found in "Dinosaur," which included the creation and implementation of realistic digital hair on the lemurs. After seeing "Jurassic Park," George Lucas, director of the Star Wars series, concluded the time had come to start working on his new Star Wars movies; in his opinion 3D animation was now advanced enough to believably create the alien worlds and characters he had wanted to make since the late seventies.
In 1995 "Toy Story" became the first full-length 3D CG feature film: the first CGI feature-length animation and Pixar's first feature film, created in partnership with Disney. The primary characters are toys in the room of a six-year-old boy, Andy, and the story is mostly told from their point of view. With the arrival of computers and 3D software, feature-length films of high polish could be created virtually in 3D, and "Toy Story" is considered the first animated feature generated completely on computers. In the new Star Wars films, almost every shot was enhanced with 3D animation, featuring very realistic 3D aliens and environments. "The Lord of the Rings: The Two Towers" featured the first photorealistic motion-captured character in a film; Gollum was also the first digital actor to win an award (BFCA), in a category created for Best Digital Acting Performance.
Motion capture, motion tracking, or mocap are terms used to describe the process of recording movement and translating that movement onto a digital model. It is used for medical applications, for validation of computer vision and robotics, and in the military, entertainment and sports. In film making it refers to recording the actions of human actors and using that information to animate digital character models in 2D and 3D computer animation. When it includes the face and fingers and captures subtle expressions, it is referred to as performance capture. In motion capture sessions, the movements of one or more actors are sampled many times per second, although with most techniques motion capture records only the movements of the actor, not his or her visual appearance. This animation data is mapped to a 3D model so that the model performs the same actions as the actor.
Although there are many different systems for capturing motion capture data, they tend to fall broadly into two different categories:
One category contains optical systems, which employ photogrammetry to establish the position of an object in 3D space based on its observed location within the 2D fields of view of a number of cameras. These systems produce data with three degrees of freedom for each marker, so rotational information must be inferred from the relative orientation of markers with respect to one another. Collecting motion data from an image without using photogrammetry or magnetic equipment is referred to as motion tracking.
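As a rough illustration of the photogrammetry step (with made-up camera positions, not a real calibration pipeline): each camera observation of a marker defines a ray in 3D, and the marker's position can be estimated as the point nearest to both rays.

```python
# Hedged sketch of optical mocap triangulation: two cameras each see a
# marker along a ray; the 3D position is the midpoint of the shortest
# segment between the rays. Camera origins/directions are illustrative.

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between lines o1+t1*d1 and o2+t2*d2."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def add(a, b): return [x + y for x, y in zip(a, b)]
    def scale(a, s): return [x * s for x in a]
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b                 # near 0 if the rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, t1))           # closest point on ray 1
    p2 = add(o2, scale(d2, t2))           # closest point on ray 2
    return scale(add(p1, p2), 0.5)

# Two cameras, both of which happen to see the marker at (1, 2, 3):
marker = closest_point_between_rays([0, 0, 0], [1, 2, 3],
                                    [5, 0, 0], [-4, 2, 3])
print(marker)   # -> [1.0, 2.0, 3.0]
```

Real systems do this for dozens of markers across many cameras per frame, plus calibration and noise handling, but the geometric core is the same.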
In the 1978 animated film of "The Lord of the Rings," the visual appearance of an actor's motion was filmed, and the footage was then used as a guide for the frame-by-frame motion of a hand-drawn animated character; the technique is comparable to the older technique of rotoscoping. Camera movements can also be motion captured, so that a virtual camera in the scene will pan, tilt, or dolly around the stage driven by a camera operator while the actor is performing, with the system capturing the camera and props as well as the actor's performance. This allows the computer-generated characters, images and sets to have the same perspective as the video images from the camera. A computer processes the actor's movements and provides the desired camera positions in terms of the objects in the set. Retroactively obtaining camera movement data from captured footage is known as match moving or camera tracking.
History of Mocap:
The development of modern mocap technology has been led by medical science, the military, and computer generated imagery (CGI), where it is used for a wide variety of purposes. There were successful attempts at mocap long before computer technology became available.
The zoopraxiscope was invented because of a $25,000 bet on whether all four feet of a horse leave the ground simultaneously. Its inventor, Eadweard Muybridge (1830-1904), was born in England and became a popular landscape photographer in San Francisco. Muybridge proved that all four feet of a trotting horse do leave the ground simultaneously, by capturing a horse's movement in a sequence of photographs taken with a set of a dozen cameras triggered by the horse's feet. The zoopraxiscope, which Muybridge himself went on to perfect, is considered one of the earliest motion capture devices. His books, "Animals in Motion" (1899) and "The Human Figure in Motion" (1901), are still used as valuable references by many artists, such as animators, cartoonists, illustrators and painters. Muybridge is a pioneer of both mocap and motion pictures.
In the same year, Etienne-Jules Marey, a physiologist and the inventor of the portable sphygmograph, was born in France. The sphygmograph is an instrument that records the pulse and blood pressure graphically; modified versions of his instruments are still used today.
Marey met Muybridge in Paris in 1882, and the following year, inspired by Muybridge's work, he invented the chronophotographic gun to record animal locomotion, though he quickly abandoned it. In the same year he invented a chronophotographic fixed-plate camera with a timed shutter that allowed him to expose multiple images on a single plate. The camera initially captured images on a glass plate, but he later replaced glass plates with paper film, and in this way film strips were introduced to the motion picture. Photographs of Marey's subjects wearing his mocap suit show a striking resemblance to skeletal mocap data. Marey's research subjects included cardiology, experimental physiology, physiological instruments, and the locomotion of humans, animals, birds, and insects. Unlike Muybridge, who used multiple cameras, Marey used a single camera for motion capture.
Around the time Muybridge and Marey passed away, Harold Edgerton was born in Nebraska. Edgerton developed his photographic skills as a student at the University of Nebraska in the early 1920s. While working on his master's degree in electrical engineering at the Massachusetts Institute of Technology (MIT) in 1926, he realized that he could observe a rotating part of a motor as if the motor were standing still by matching the frequency of a strobe's flashes to the speed of the motor's rotation. Edgerton developed the stroboscope to freeze fast-moving objects and capture them on film, and he became a pioneer of high-speed photography.
In 1937 Edgerton designed the first successful underwater camera, and he made many trips aboard the research vessel Calypso with French oceanographer Jacques Cousteau. In 1954 he designed and built deep-sea electronic flash equipment. Edgerton passed away in 1990, ending a long career as an educator and researcher at MIT.
Max Fleischer, an art editor for Popular Science Monthly, was born in Vienna in 1883 and moved to the U.S. with his family. He came up with the idea of producing animation by tracing live-action film frame by frame. In 1915 Fleischer filmed his brother David in a clown costume, and they spent almost a year making their first animation using the rotoscope, for which he obtained a patent in 1917. In 1918, when World War I ended, he produced the first animation in the "Out of the Inkwell" series and established Out of the Inkwell, Inc., which was later renamed Fleischer Studio. In this series animation and live action were mixed, and Fleischer himself interacted with the animated characters, Koko the Clown and Fitz the dog. In 1924, four years before Disney's "Steamboat Willie," he produced animation with a synchronized soundtrack. Characters such as Popeye and Superman were animated at Fleischer's studio, and Betty Boop first appeared in Fleischer's animation before becoming a comic strip character. Early animations of the 1930s were filled with sexual humour, ethnic jokes, and gags, so when the Hays Production Code (censorship) became effective in 1934 it affected Fleischer Studio more than other studios: Betty Boop lost her garters and her sex appeal as a result.
After almost four years of production, Walt Disney presented the first feature-length animation, "Snow White and the Seven Dwarfs," which was a huge success. Paramount, the distributor of Fleischer's animations, pressured Max and David Fleischer to produce feature-length animations of their own. The two features, "Gulliver's Travels" (1939) and "Mr. Bug Goes to Town" (1941), were produced with money borrowed from Paramount, and both were box-office disasters. The failure of "Mr. Bug" led Paramount to fire the Fleischer brothers and change the studio's name to Famous Studios. Max Fleischer sued Paramount over the distribution of his animations. Before he died in 1972, he signed a Betty Boop merchandising deal with King Features, a unit of the Hearst Corporation.
The use of rotoscoping can be seen in Disney animations starting with "Snow White." In later Disney animations the characters were highly stylized, and rotoscoping became a method for studying human and animal motion. Comparison between film footage and the corresponding scenes in the animations reveals the skilful and selective use of rotoscoping by Disney animators: they went above and beyond simple tracing. "Snow White's" success can also be attributed to Walt Disney's detailed attention to plot, character development and artistry.
Both Max Fleischer and Walt Disney were highly innovative individuals; however, it has been said that "Disney's memory belongs to the public; Max's to those who remember him by choice" (Heraldson, 1975).
Beginning of Digital Mocap:
Research and development of digital mocap technology started in the 1970s in pursuit of medical and military applications, and in the 1980s the CGI industry discovered the technology's potential. In the 1980s floppy disks were actually floppy and most computers were equipped with monochrome monitors, some with calligraphic displays. To view colour images, for example rendered animation frames, images had to be sent to a "frame buffer," which was often shared by multiple users due to its cost. Large computers were housed in ice-cold server rooms, and offices were filled with the noise of dot matrix printers. The ray tracing and radiosity algorithms were published in the 1980s, and renderers based on them required a supercomputer or workstations to render animation frames in a reasonable amount of time; personal computers weren't powerful enough. CPUs, memory, storage devices, and applications were far more expensive than today. Wavefront Technologies developed and marketed the first commercial off-the-shelf 3D computer animation software in 1985. At that time only a handful of animation production companies existed, and most of the animations they produced were "flying logos" for TV commercials or the opening sequences of TV programmes, pieces 15 to 30 seconds long. Readers who saw "Brilliance" in the 1980s probably still remember the astonishment of seeing a computer-generated character, a shiny female robot, moving like a real human being.
While "Brilliance" was the first successful application of mocap technology in CGI, "Total Recall" saw the first failed attempt to use mocap in a feature film. Metrolight Studios was one of the post-production companies contracted to produce effects for the 1990 science fiction film starring Arnold Schwarzenegger and Sharon Stone. Metrolight decided to use mocap to create an animation sequence of moving skeletons for the scene in which Schwarzenegger's character goes through a large airport security X-ray machine, along with other people and a dog. An operator from an optical mocap equipment company was sent out to a location with a mocap system, and a team from Metrolight followed the operator's instructions while capturing performances by Schwarzenegger and other performers. They went home believing that the capture session had gone well and that the mocap company would deliver the data after cleaning and processing it. However, Metrolight never received usable data and had to give up using mocap for the scene.
Metrolight's unfortunate experience teaches one lesson: hire only a service provider with a good track record and references.
"FX Fighter," released in 1995, was the first real-time fighting game with 3D characters in 3D environments, and one of the first video games to use mocap technology to give realism to 3D characters' movements. Game characters are animated in real time in response to user input, drawing on a set of motion-captured actions. The pieces of action are played back in such a way that the player does not notice the transition from one action to another, giving the impression that the player is fully in control of the character's movement. Seeing the success of the game, other game companies were encouraged to use mocap in their games.
These pioneering efforts of the 1980s and 1990s led to remarkable development and achievement in digital mocap. In recent years, in addition to medicine and entertainment, mocap applications have been found in many other fields. Various sports use mocap to analyze and enhance athletes' performance and to prevent injuries. Designers use mocap to understand users' movements, constraints, and interactions with environments, and to design better products. Engineers use mocap to analyze human movements and design robots that walk like us. Art historians and educators use mocap to archive and study performances by dancers and actors: for instance, in 1991 an intricate performance by the legendary French mime Marcel Marceau (1923-2007) was captured at the Ohio State University to preserve his art for future generations.
3D ANIMATION PRODUCTION PIPELINE
Convincing the studio's decision-makers to back the story.
A solid summary of the story's plot: what the film is about, what happens in it, and variations that may or may not appear in the final product.
Basic sketches of the scenes.
(Time usually taken = 6 months)
At first the artists themselves do the voice acting, to connect the storyboard to the script and give an idea of the film; later on, celebrities are paid to voice the characters.
Pictures placed on a timeline with the voice recordings playing in conjunction: basically a very rough film.
Artists try to create the look and feel of the scenery and the characters from the scripts, voice recordings and basic drawings; the artists also get a first crack at how lighting sets the mood.
The characters, props and landscape have started to be created in 3D; hinges have been added to them to give them movement. Everything is still in wireframe form, with no textures added yet (think skeletons).
The models and props are skinned according to the mood and feel the team wants the film to portray.
The roughly skinned objects and characters are set into positions to work out camera angles and movement; nothing is truly animated yet. Recordings of these final cuts are passed on to the animation team.
(Time Usually taken = 4 weeks)
The models are animated. Everything such as the skeleton is already there, so the animators are basically choreographers (think puppeteers): they move the mouths and limbs according to the sounds and the scripts.
(Time usually taken 4 weeks)
Shading changes surfaces according to the lighting on them: a shader affects a model's colour depending on the lighting situation, e.g. light bouncing off a shiny metal surface is successfully reproduced thanks to a shader. Shaders are added to the landscapes, models and props.
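As a toy illustration of what a shader does (a deliberately simplified diffuse model, not the code of any real renderer): the surface colour is scaled by how directly the light hits the surface.

```python
# Hedged sketch of a diffuse (Lambert) shader: colour scales with the
# cosine of the angle between the surface normal and the light direction.
# All names and values are invented for illustration.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def lambert(base_rgb, normal, light_dir, light_rgb=(1.0, 1.0, 1.0)):
    n = normalize(normal)
    l = normalize(light_dir)
    # Facing ratio: 1 when the light hits head-on, 0 when edge-on or behind.
    facing = max(0.0, sum(a * b for a, b in zip(n, l)))
    return [c * lc * facing for c, lc in zip(base_rgb, light_rgb)]

# Light straight above a surface facing up: full brightness.
print(lambert([0.8, 0.2, 0.2], [0, 1, 0], [0, 1, 0]))   # -> [0.8, 0.2, 0.2]
```

A shiny-metal look would add a specular term on top of this, but the principle, recolouring surfaces per light, is the same.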
Lighting is added to the scenes. Lighting is what actually makes everything look great, and it is based on the mood set by the scripts.
(Time usually taken = 8 weeks)
The final product is rendered; this can take a very long time per frame, depending on the quality of the graphics.
Things such as music scores, special effects and sound effects are added, the film is also recorded to an appropriate format.
MOTION CAPTURE PRODUCTION PIPELINE
Storyboard development & Shot analysis
It is important to work out exactly what action is needed at this stage, plus any restrictions which may impede the actor. There are several factors which need to be addressed:
Does the actor's size correspond to that of the character?
Should the actor have any props or costume? (For example, having the actor wear horns for your demon character in the mocap session will prevent the arms from passing through the horns at the implementation stage.) The spatial surroundings should also be a factor.
Will the motion need to be blended? (e.g. a running motion, as the motion capture studio will only capture a fragment of the run).
Develop a character rig, which involves the following:
Matching the actor’s size as much as possible.
Constraining the joints.
Problems may include exporting out of your animation package into the correct format (e.g. .xsi into .fbx). Several different export formats should be tested to determine which suits the character rig best (e.g. .bvh, .fbx, etc.).
Actual Motion Capture
This can be viewed on a rig in real time. There are several different forms of Motion Capture devices. The most commonly used are:
Mechanical, Optical, and Electromagnetic (magnetic)
This involves several data manipulations being applied to the motion capture data. In optical motion capture systems, for example, after you capture the movements of your actors the data is stored as raw 2D data. A 'reconstruction' process converts it into continuous 3D trajectories, a 'labelling' process labels each trajectory, and so on. Additional processing may be needed where there are data gaps, jitter and other noise in the data.
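Two of these clean-up steps, filling gaps left by occluded markers and suppressing jitter, can be sketched as follows; the data layout and function names are invented for illustration, not taken from any mocap package.

```python
# Hedged sketch of mocap data clean-up. 'track' is one coordinate of one
# marker over time; None marks frames where the marker was occluded.

def fill_gaps(track):
    """Linearly interpolate across runs of None bounded by known samples."""
    out = list(track)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while j < len(out) and out[j] is None:
                j += 1                          # find end of the gap
            if 0 < i and j < len(out):          # gap bounded on both sides
                a, b = out[i - 1], out[j]
                for k in range(i, j):
                    t = (k - i + 1) / (j - i + 1)
                    out[k] = a + (b - a) * t
            i = j
        else:
            i += 1
    return out

def smooth(track, radius=1):
    """Moving average over a window of 2*radius+1 frames to damp jitter."""
    out = []
    for i in range(len(track)):
        window = track[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

raw = [0.0, 1.0, None, None, 4.0, 4.2]
print(fill_gaps(raw))        # -> [0.0, 1.0, 2.0, 3.0, 4.0, 4.2]
```

Production pipelines use more sophisticated filters (splines, Kalman filtering), but this shows the shape of the problem.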
This is simply the process of applying your data to the skeleton rig built in the initial stages. There can be several problems at this stage depending on the formats and animation package chosen: for example, issues with UVs, materials, scaling, etc. It is suggested you follow each package's pipeline to minimize these issues.
APPLICATIONS OF MOTION CAPTURE
The process of recording movement and translating that movement onto a digital model is called motion capture, motion tracking or mocap. It is applied in the military, entertainment, sports and medical fields, and in the validation of computer vision and robotics.
The largest market for motion capture is game development. With games drawing as much revenue as movies, it is easy to see why game development often calls for enormous quantities of motion capture. There are basically two types of 3D character animation used in games: real-time playback and cinematics. Real-time playback allows the game player to choose from pre-created moves, controlling the character's movements in real time. Cinematics are the fully rendered 'movies' used for the intros and 'cut-scenes.' Often the last part of game production, or a process sub-contracted to a separate studio, cinematics are generally not essential to game-play, but they add a lot of appeal to the game and help immensely with story development and mood generation.
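The real-time playback idea of hiding the transition between pre-created moves can be sketched as a simple crossfade; the clip contents and blend length here are invented for illustration, not taken from any game engine.

```python
# Hedged sketch of blending motion-captured clips: when the player
# switches actions, the tail of the current clip is crossfaded into the
# head of the next one so no pop is visible. Clips are per-frame values
# for a single joint angle, purely illustrative.

def crossfade(clip_a, clip_b, blend_frames):
    """Overlap the last blend_frames of clip_a with the first of clip_b."""
    head = clip_a[:-blend_frames]
    tail = clip_b[blend_frames:]
    blended = []
    for i in range(blend_frames):
        w = (i + 1) / (blend_frames + 1)        # weight ramps toward clip_b
        a = clip_a[len(clip_a) - blend_frames + i]
        b = clip_b[i]
        blended.append(a * (1 - w) + b * w)
    return head + blended + tail

walk = [0.0, 0.0, 0.0, 0.0]   # one joint angle per frame while walking
run  = [8.0, 8.0, 8.0, 8.0]   # same joint while running
print(crossfade(walk, run, 2))  # ramps smoothly from 0.0 up to 8.0
```

Real engines blend full poses (all joints, usually as quaternions) and pick blend windows per transition, but the weighting scheme is the same idea.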
Video and TV
Real-time motion capture is becoming popular in live television broadcasts. Using motion capture we can place a virtual character within a real scene, place live actors within a virtual scene alongside virtual actors, or place virtual characters within a virtual scene.
For real-time broadcasting, mocap requires careful handling of any non-standard physiology to keep the performer's motion from causing the character's limbs to interpenetrate its body. Joint limits on the shoulders and knees also help maintain the believability of the character, and a real-time adaptation feature such as MotionBuilder's real-time motion mapping is essential when the character's body is very different from the actor's body. When combining live elements with virtual elements, the real and virtual cameras must share the same properties, otherwise the illusion looks strange.
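The joint-limit idea can be sketched as a simple clamp on each incoming joint angle; the limit values and names below are invented for illustration, not taken from MotionBuilder.

```python
# Hedged sketch of joint limits in a retargeting step: each mocap joint
# angle is clamped into the character's allowed range so the performer's
# motion cannot bend the character's limbs into its body.

# Illustrative limits in degrees (hypothetical character rig).
JOINT_LIMITS = {
    "knee":     (0.0, 150.0),     # knees don't bend backwards
    "shoulder": (-90.0, 170.0),
}

def apply_joint_limits(pose):
    """pose: {joint_name: angle_in_degrees} from the mocap stream."""
    limited = {}
    for joint, angle in pose.items():
        lo, hi = JOINT_LIMITS.get(joint, (-180.0, 180.0))
        limited[joint] = min(max(angle, lo), hi)
    return limited

frame = {"knee": -12.0, "shoulder": 45.0}    # noisy frame bends a knee backwards
print(apply_joint_limits(frame))             # -> {'knee': 0.0, 'shoulder': 45.0}
```

Real rigs express limits per rotation axis and often blend toward the limit rather than hard-clamping, but the believability payoff is the same.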
Producing daily 3D animated features becomes easy with the PhaseSpace optical motion capture system combined with MotionBuilder, allowing TV stations to keep their content fresh and exciting, and giving viewers yet another reason not to touch that dial.
Post-Production for ongoing series
Using motion capture for ongoing series is gaining popularity. Creating a weekly show without motion capture invariably causes shows to be late or production studios to go bankrupt, so an efficient motion capture pipeline is essential to the success of an ongoing character-animation-based series.
The use of motion capture in films is increasing day by day. Motion capture is essential for creating character-based animation that moves realistically, in situations that would be impractical or too dangerous for real actors, e.g. the characters falling off the ship in "Titanic." Motion capture was used extensively in "Titanic" for filler characters; many of these shots would have been difficult or impossible to do with real cameras and a real ship, or real models, so virtual models, actors, and cameras were used. Some film characters require the use of motion capture, otherwise their animation seems fake. More and more independent companies are starting to put together desktop studios: the idea of two or three people creating an entire movie is not far off if motion capture is used correctly, since motion capture animation can then be done very quickly and inexpensively, without scheduling expensive sessions in a studio.
Motion capture is ideal for the web, whether used to create virtual hosts or greeting cards. As the web becomes more sophisticated and bandwidth increases, motion capture brings a human element to it, in the form of characters that viewers can relate to and interact with.
Motion-capture-generated performance animation can be thought of as 'improvisation meets computer graphics (CG).' A good improviser acting through a CG character in real time can create a very intriguing, lasting experience for the viewer at trade shows, meetings or press conferences. Integrating with live actors further helps create a fascinating experience.
Motion capture is also useful in perceptual research. By presenting test subjects with abstract movements distilled from motion capture data, repeatable experiments can be developed that provide insights into human perception.
Biomechanical analysis for rehabilitation purposes relies on motion capture. It can be used to measure the extent of a client's disability as well as a client's progress in rehabilitation, and it can also help in the effective design of prosthetic devices.
Motion capture is essential for producing ergonomically practical product designs, as well as designs for physical products that are comfortable and appealing. When it comes to working in an enclosed space such as a car interior, the Gypsy has tremendous advantages over optical or magnetic systems.