Bullet time

Bullet time (also known as frozen moment, the big freeze, dead time, flow motion or time slice) is a visual effect or visual impression of detaching the time and space of a camera (or viewer) from that of its visible subject. It is a depth-enhanced simulation of variable-speed action and performance found in films, broadcast advertisements, and real-time graphics within video games and other special media. It is characterized both by its extreme transformation of time (slow enough to show normally imperceptible and unfilmable events, such as flying bullets) and of space (the camera angle, the audience's point of view, moves around the scene at normal speed while events are slowed). This is almost impossible with conventional slow motion, as the physical camera would have to move impossibly fast; the concept implies that only a "virtual camera", often illustrated within the confines of a computer-generated environment such as a virtual world or virtual reality, would be capable of "filming" bullet-time moments. Technical and historical variations of this effect have been referred to as time slicing, view morphing, temps mort (French: "dead time") and virtual cinematography.

The term "bullet time" is a registered trademark of Warner Bros., formally established in March 2005, in connection with the video game The Matrix Online. The term had first been used within the original script of the 1999 film The Matrix, and later in reference to the slow motion effects in the 2001 video game Max Payne. In the years since the introduction of the term during The Matrix films it has become a commonly applied expression in popular culture.


History

The technique of using a group of still cameras to freeze motion predates the invention of cinema itself, dating back to 19th-century experiments by Eadweard Muybridge. In Sallie Gardner at a Gallop (1878), Muybridge analyzed the motion of a galloping horse by using a line of cameras to photograph the animal as it ran past. He placed still cameras along a racetrack, each actuated by a taut string stretched across the track; as the horse galloped past, the camera shutters snapped, taking one frame at a time. The original intent was to settle a debate Leland Stanford had engaged in, as to whether all four of the animal's legs would leave the ground when galloping. Muybridge later assembled the pictures into a rudimentary animation by placing them on a glass disk, which he spun in front of a light source. His zoopraxiscope may have been an inspiration for Thomas Edison to explore the idea of motion pictures.

The first widely noticed predecessors of bullet time were Leni Riefenstahl's famous diving sequences in her documentary Olympia, covering the 1936 Olympic Games. Riefenstahl used a single-camera slow-motion tracking-shot technique to achieve a similar effect.

Muybridge also took photos of actions from many angles at the same instant in time, to study how the human body went up stairs, for example. In effect, however, Muybridge had achieved the aesthetic opposite to modern bullet-time sequences, since his studies lacked the dimensionality of the later developments. A debt may also be owed to MIT professor Doc Edgerton, who, in the 1940s, captured now-iconic photos of bullets using xenon strobe lights to "freeze" motion.

The first application of bullet time appeared in a scene from the 1962 movie Zotz!, in which Professor Jonathan Jones uses a magical amulet and shouts the word 'Zotz!' to slow down a speeding bullet.

Bullet time as a concept was frequently developed in cel animation. One of the earliest examples is the shot at the end of the title sequence of the 1966 Japanese anime series Speed Racer: as Speed leaps from the Mach Five, he freezes in mid-jump, and the camera performs an arc shot from a frontal view to a side view.

In 1980, while studying for a BA at the (then named) Bath Academy of Art, Tim Macmillan began producing pioneering film, and later video, in this field, using 16mm film exposed through a progressing circular arrangement of pinhole cameras. These were the first iteration of the "Time-Slice" motion-picture array cameras, which he developed in the early 1990s, once still cameras capable of the image quality required for broadcast and movie applications became available. In 1997 he founded Time-Slice Films Ltd. (UK). He applied the technique to his artistic practice in a video projection titled Dead Horse, an ironic reference to Muybridge, which was exhibited at the London Electronic Arts Gallery in 1998 and was nominated for the Citibank Prize for photography in 2000.

The first music video to use aspects of bullet time was "Midnight Mover", a 1985 Accept video. In the 1990s, a morphing-based variation on time slicing was employed by director Michel Gondry and the visual effects company BUF Compagnie in the music video for The Rolling Stones' "Like a Rolling Stone", and in a 1996 Smirnoff commercial the effect was used to depict slow-motion bullets being dodged. Similar time-slice effects were also featured in commercials for The Gap (directed by M. Rolston and again produced by BUF), in feature films such as Lost in Space (1998) and Buffalo '66 (1998), and in the television program The Human Body.

Slow-motion footage has long been used to depict action scenes in feature films, for example the gunfights in The Wild Bunch (directed by Sam Peckinpah) and the heroic bloodshed films of John Woo. Subsequently, the 1998 film Blade featured a scene that used computer-generated bullets and slow-motion footage to illustrate characters' superhuman bullet-dodging reflexes. The 1999 film The Matrix combined these elements (gunfight action scenes, superhuman bullet-dodging, and time-slice effects), popularizing both the effect and the term "bullet time". The Matrix's version of the effect was created by John Gaeta and Manex Visual Effects. Rigs of still cameras were set up in patterns determined by simulations, and then shot either simultaneously (producing an effect similar to previous time-slice scenes) or sequentially (which added a temporal element to the effect). Interpolation effects, digital compositing, and computer-generated "virtual" scenery were used to improve the fluidity of the apparent camera motion. Gaeta said of The Matrix's use of the effect:

For artistic inspiration for bullet time, I would credit Otomo Katsuhiro, who co-wrote and directed Akira, which definitely blew me away, along with director Michel Gondry. His music videos experimented with a different type of technique called view-morphing and it was just part of the beginning of uncovering the creative approaches toward using still cameras for special effects. Our technique was significantly different because we built it to move around objects that were themselves in motion, and we were also able to create slow-motion events that 'virtual cameras' could move around - rather than the static action in Gondry's music videos with limited camera moves.

Following The Matrix, bullet time and other slow-motion effects were featured as key gameplay mechanics in various video games. Cyclone Studios' Requiem: Avenging Angel, released in March 1999, features slow-motion effects. Remedy Entertainment's 2001 video game Max Payne contains a slow-motion mechanic that allows players to view the paths of bullets, an effect explicitly referred to as "Bullet Time".

Bullet time was first used in a live music environment in October 2009, for Creed's live DVD Creed Live.


Technology

The bullet time effect was originally achieved photographically with a set of still cameras surrounding the subject. The cameras are fired sequentially, or all at the same time, depending on the desired effect. Single frames from each camera are then arranged and displayed consecutively to produce an orbiting viewpoint of an action frozen in time or unfolding in hyper-slow motion. This technique suggests the limitless perspectives and variable frame rates possible with a virtual camera; however, if the still-array process is done with real cameras, it is often limited to assigned paths.
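To make the frame-assembly step concrete, the sketch below is a minimal, hypothetical illustration in Python (the camera count, the file-naming scheme, and the use of the imageio library are assumptions made for the example, not a description of any production pipeline). It picks one frame per camera along the arc: the same instant from every camera yields a frozen-moment orbit, while successive instants yield a hyper-slow-motion sweep.

```python
# Hypothetical sketch: ordering frames from an array of still cameras into a bullet-time sweep.
# Assumes frames were saved as cam<index>_t<frame>.png; names and counts are illustrative only.
import imageio

NUM_CAMERAS = 120      # cameras arranged along the planned arc
FROZEN_INSTANT = 0     # frame index exposed when all cameras fire simultaneously

def sweep_frames(simultaneous: bool):
    """Yield one image per camera, walking along the arc.

    simultaneous=True  -> frozen moment: every camera shows the same instant.
    simultaneous=False -> hyper-slow motion: each camera shows the next instant,
                          so time creeps forward as the viewpoint orbits.
    """
    for cam in range(NUM_CAMERAS):
        t = FROZEN_INSTANT if simultaneous else cam
        yield imageio.imread(f"cam{cam:03d}_t{t:04d}.png")

# Play the ordered frames back as a clip (requires imageio's ffmpeg plugin for .mp4 output).
imageio.mimsave("bullet_time_sweep.mp4", list(sweep_frames(simultaneous=False)), fps=24)
```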

In The Matrix, the camera path was pre-designed using computer-generated visualizations as a guide. Cameras were arranged, behind a green or blue screen, on a track and aligned through a laser targeting system, forming a complex curve through space. The cameras were then triggered at extremely close intervals, so the action continued to unfold, in extreme slow-motion, while the viewpoint moved. Additionally, the individual frames were scanned for computer processing. Using sophisticated interpolation software, extra frames could be inserted to slow down the action further and improve the fluidity of the movement (especially the frame rate of the images); frames could also be dropped to speed up the action. This approach provides greater flexibility than a purely photographic one. The same effect can also be simulated using pure CGI, motion capture and other approaches.
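The frame-insertion idea can be illustrated with a deliberately naive sketch. The following Python snippet is a toy example, not the sophisticated interpolation software used on the film: it simply inserts cross-dissolved in-between frames so that playback of the same action takes longer.

```python
# Naive illustration of inserting synthetic in-between frames by blending neighbours.
# Real interpolation tools track motion between frames; a plain cross-dissolve only hints at the idea.
import numpy as np

def insert_inbetweens(frames, extra_per_gap=3):
    """Return a new frame list with `extra_per_gap` blended frames between each original pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, extra_per_gap + 1):
            alpha = k / (extra_per_gap + 1)
            # Linear blend of neighbouring frames; weights shift gradually from a to b.
            out.append(((1 - alpha) * a + alpha * b).astype(a.dtype))
    out.append(frames[-1])
    return out

# Example with three dummy 2x2 grayscale frames: 3 originals become 9 frames,
# so the same action plays back roughly three times slower at a fixed frame rate.
frames = [np.full((2, 2), v, dtype=np.uint8) for v in (0, 128, 255)]
print(len(insert_inbetweens(frames, extra_per_gap=3)))  # -> 9
```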

Bullet time evolved further through The Matrix series (1999-2003) with the introduction of high-definition computer-generated approaches such as virtual cinematography and Universal Capture. Universal Capture, a machine-vision-guided system, was the first motion-picture deployment of an array of high-definition cameras focused on a common human subject (the actor playing Neo) in order to create volumetric photography. As with the concept of bullet time, the subject could be viewed from any angle; at the same time, the depth-based media could be recomposed and spatially integrated within computer-generated constructs. It moved past the visual concept of a virtual camera to become an actual virtual camera. Virtual elements within the Matrix Trilogy utilized state-of-the-art image-based computer rendering techniques pioneered in Paul Debevec's 1997 film The Campanile and custom-evolved for The Matrix by George Borshukov, an early collaborator of Debevec. Inspiration aside, the virtual camera methodologies pioneered within the Matrix Trilogy have often been credited as fundamentally contributing to the capture approaches required for emergent virtual reality and other immersive experience platforms.

For many years, it has been possible to use computer vision techniques to capture scenes and render images from novel viewpoints sufficient for bullet-time-type effects. More recently, these techniques have been formalized into what is becoming known as free viewpoint television (FTV). At the time of The Matrix, FTV was not a fully mature technology. FTV is effectively the live-action version of bullet time, without the slow motion.

Source of the article: Wikipedia


