Film as a medium is tied to technology, and new tech often changes the way movies are made and seen. Here is a history of film in 11 technical advances.
1906 – Feature Length Films
Cinema was invented in France in the 1890s; exactly by whom depends on who you ask. But the first public screening was held by the Lumière brothers, in Paris in 1895.
The first films had simple subjects and usually ran for 10 to 15 minutes: the length of one reel of film.
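The reel arithmetic can be sketched out (the figures below are approximate, illustrative silent-era values, not from the article):

```python
# Why one reel ran roughly 10-15 minutes (approximate figures).
reel_feet = 1000        # a standard 35mm reel held ~1,000 feet
frames_per_foot = 16    # 35mm film carries 16 frames per foot
fps = 18                # silent-era speeds varied, ~16-20 fps

minutes = reel_feet * frames_per_foot / fps / 60
print(f"~{minutes:.0f} minutes per reel")  # -> ~15 minutes per reel
```

Projection speed was not standardised in this era, which is why runtimes per reel varied.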
The idea to make a longer film, and so enable more complex storytelling, came from an amateur cinema enthusiast in Melbourne, Australia. William Gibson was a pharmacist who liked movies so much he bought his own projection equipment, and held screenings on the roof of his shop.
This expanded into a side-line, screening films at different venues around the city.
Noting the popularity of a stage version of the life of Australian bushranger Ned Kelly, Gibson proposed a film version of the same. He enlisted Charles Tait, a local live entertainment promoter, as director.
‘The Story of the Kelly Gang’ was shot on location in and around Melbourne, and took six months to complete. Its runtime was 80 minutes.
The film debuted in Melbourne on Boxing Day, 1906, and was an immediate smash; temporary outdoor cinemas eventually had to be erected to handle the crowds. Prints toured the country, and then overseas.
The popularity of ‘Kelly Gang’ inspired other film makers; the next feature length films were produced in Europe, the following year. The medium was off and running.
1908 – Colour
Early films were produced in black and white, but there was an almost immediate push to make the leap to colour.
Georges Méliès, a French illusionist who became one of the first professional film makers, experimented by having his films coloured by hand, using ink and dye. Other film makers also used this technique, but the process was laborious.
In 1906, George Albert Smith, a stage performer and amateur inventor, patented the ‘Kinemacolour’ process. This was a mechanical method to capture the natural colour of what was being filmed.
A red and green filter rotated at high speed in front of the camera while it was filming; a similar filter would then be used when the film was projected, which allowed the images to display in colour (albeit in a limited palette).
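The two-colour principle can be sketched as a toy simulation in Python (hypothetical array values; this models the idea of two colour records, not the actual rotating-shutter mechanics):

```python
import numpy as np

# A hypothetical 4x4 "scene", RGB values in [0, 1].
scene = np.random.default_rng(0).random((4, 4, 3))

# The rotating filter alternates exposures: one frame records
# the scene's red light, the next its green light.
red_record = scene[..., 0]    # frame shot through the red filter
green_record = scene[..., 1]  # frame shot through the green filter

# Projection through matching filters recombines the two records;
# persistence of vision blends them into a single colour image.
reconstructed = np.zeros_like(scene)
reconstructed[..., 0] = red_record
reconstructed[..., 1] = green_record
# There is no blue record, hence the limited palette.
```

Anything predominantly blue in the original scene is simply lost, which is why Kinemacolour images look washed out to modern eyes.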
Kinemacolour was first used in the British short ‘A Visit to the Seaside’, in 1908 (pictured above). The first colour feature was ‘With Our King and Queen Through India’, a documentary produced in 1912.
Technicolour, a similar filter-based process, was first demonstrated in 1916. Initially a two-colour system, it would become the dominant colour process in Hollywood, although its famous three-strip version was not perfected until the 1930s.
(‘A Visit to the Seaside’ survives, and can be watched on YouTube).
1917 – Animated Features
Disney’s ‘Snow White’ is often cited as the first feature length animated film, but this is not actually correct.
Snow White was made using ‘cel animation’, and while it was the first of that type, other animators got there before Disney, using different techniques.
The first to do so was Quirino Cristiani, an Argentinian political cartoonist and self-taught animator.
In 1916, Cristiani was hired by producer Federico Valle, to make a one minute animated short lampooning the government. Valle produced newsreels, and felt some animated content would distinguish his output, in a competitive market.
The short was popular, and Valle then staked Cristiani in a more ambitious project: an 80 minute animated feature. Released in 1917, ‘El Apóstol’ (The Apostle) poked fun at Argentinian President Hipólito Yrigoyen, showing him with God-like powers, righting the country’s wrongs with lightning bolts before ascending to heaven.
Cristiani used a kind of primitive stop motion for his film: the characters were 2D cardboard models that he manipulated and filmed, one frame at a time. Sadly, the one print of ‘El Apóstol’ was subsequently lost in a fire; only still images remain.
Other animated features appeared through the 1920s, usually deploying stop motion, before Snow White’s revolutionary impact in 1937.
1927 – Talkies
Like colour images, sound had been sought from the earliest days of cinema.
At the first screenings of ‘The Story of the Kelly Gang’ in 1906, actors were deployed behind the screen to read lines of dialogue aloud, and produce sound effects. During the silent film era, live musical accompaniment in the cinema was also common.
It was not until the 1920s that technology allowing synchronised sound recording and playback became available. Western Electric, a New York based firm, was one of several companies working on film audio in this period.
Their ‘Vitaphone’ system (pictured above) utilised a phonograph: sound was recorded onto discs during filming, and then played back in sync with the finished film. The system was simple but effective; after a demonstration in 1925, Warner Brothers acquired exclusive rights to it.
Two years later they would use Vitaphone to make ‘The Jazz Singer’, a fictionalised biopic of its star, Al Jolson.
While sound was dismissed by some in the industry as a novelty, ‘The Jazz Singer’ was a sensation. Its dynamic performance at the box office ushered in the era of the ‘talkie’, and consigned silent films (mostly) to history.
1940 – Surround Sound
Sound technology would take a further leap forward a decade later. In 1938, Disney began production on ‘Fantasia’, an ambitious musical anthology that set short, animated episodes to classical music.
Film sound to this point had been delivered in mono: a single audio track that was the same out of every speaker. For Fantasia, the studio developed something they called ‘Fantasound’: multi-channel audio that fed different components of the soundtrack through different speakers.
This would allow complex audio effects for the first time.
As objects moved across the screen, the sound would track across the speakers in sequence, providing an audio sense of movement to match the visuals. In later years, this would come to be known as ‘surround sound’.
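The effect can be sketched as a simple panning function (a modern digital simplification, purely illustrative; Fantasound itself was cued by hand, as described below):

```python
import numpy as np

def pan_gains(position, n_channels=3):
    """Per-speaker gain for a sound at `position` in [0, 1]
    (0 = far left, 1 = far right), linearly crossfaded between
    the two nearest of `n_channels` equally spaced speakers."""
    centres = np.linspace(0, 1, n_channels)
    spacing = 1 / (n_channels - 1)
    # Each speaker contributes only while the sound is within
    # one speaker-spacing of it.
    gains = np.clip(1 - np.abs(centres - position) / spacing, 0, None)
    return gains / gains.sum()  # keep the total level constant

# A sound sweeping left to right: the energy hands over
# from speaker to speaker in sequence.
for pos in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(pos, pan_gains(pos).round(2))
```

At position 0 all the energy sits in the left speaker; at 0.5 it sits in the centre; values in between split it across the two nearest speakers, which is what creates the sense of motion.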
But in 1940, this was an idea ahead of its time.
Fantasound was not automated; the different speakers and channels (3 of them) were cued by an operator, working off a complicated set of instructions. The required equipment was also bulky, and costly to install.
Fantasia was released to mixed reviews, and was only shown in Fantasound in 13 cinemas. As the film struggled at the box office, the surround sound experiment was curtailed.
1940 – Green Screen
1940 also saw the first use of a visual effects technique often referred to as ‘green screen’, although in its first iteration the screens in question were actually blue.
Green screen – technically known as ‘Colour Separation Overlay’ – is a method of blending two pieces of film together, and is often used to create imaginative backgrounds.
The actors perform in front of a coloured screen (initially blue, later green). Because the screen is a single colour, it can be filtered out; the remaining imagery can then be overlaid on another piece of film, which becomes the new background.
This is the technique that allows actors to visit times and places long gone, or that have never existed at all.
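In modern digital form the same idea is chroma keying. A minimal sketch, assuming numpy and toy RGB frames (the 1940 version was an optical film process, not code):

```python
import numpy as np

def chroma_key(foreground, background, threshold=0.4):
    """Composite foreground over background wherever the
    foreground pixel is dominated by screen green."""
    r, g, b = (foreground[..., i] for i in range(3))
    # A pixel counts as "screen" if green clearly exceeds red and blue.
    screen = (g - np.maximum(r, b)) > threshold
    out = foreground.copy()
    out[screen] = background[screen]
    return out

# Toy 2x2 frames: the green pixels are replaced by the background.
fg = np.array([[[0.0, 1.0, 0.0], [0.8, 0.2, 0.1]],
               [[0.1, 0.9, 0.1], [0.3, 0.3, 0.3]]])
bg = np.full((2, 2, 3), 0.5)
result = chroma_key(fg, bg)
```

This is also why actors on a green screen set cannot wear green: the key cannot distinguish wardrobe from screen.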
Colour Separation Overlay was developed by RKO Radio Pictures in the 1930s, based on simpler versions that had been used to make title cards. It was first used in the 1940 film ‘The Thief of Bagdad’, to transport the action to the Middle East (pictured above).
The film won the Best Special Effects Oscar to mark the achievement.
1973 – Computer Generated Imagery
Another precursor to modern visual effects would come in the early 1970s.
The sci fi film ‘Westworld’ imagined a theme park where patrons experienced fully immersive simulations of history, replete with robots playing era appropriate characters.
In the film’s Old West section, a robot cowboy goes haywire and starts attacking the customers.
The film was written and directed by Michael Crichton, who would later mine similar ideas in ‘Jurassic Park’. He wanted something boldly different to show the robot cowboy’s perspective: computer generated imagery, to create an electronic viewpoint.
But computer technology in the 1970s was primitive, and dedicated visual effects studios were not yet in existence.
Crichton tried his luck with NASA. The Jet Propulsion Laboratory, operator of some of the world’s most powerful computers, said they could create the required visuals, but their quote – $200,000 – was above Crichton’s budget.
Instead, he turned to John Whitney Jr, a young experimental film maker. Whitney sourced an optical scanner – then a rare item – and a computer from a company called Information International Inc. He scanned footage from the film into the computer, and then digitally manipulated it.
It was a trial and error approach, and painstaking work: the equipment took 8 hours to render 10 seconds of footage. Whitney laboured for nearly a year to complete 2 minutes of film; the CGI inserts were only finished two weeks before ‘Westworld’ was due to open.
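The look Whitney produced was pixelated, block-mosaic footage. The effect is trivial to reproduce digitally today; a sketch, assuming numpy (the block size and frame are hypothetical, standing in for a process that then took hours per second):

```python
import numpy as np

def pixelate(image, block=8):
    """Average each block x block tile into a single flat colour,
    giving the coarse 'robot vision' mosaic look."""
    h = image.shape[0] - image.shape[0] % block
    w = image.shape[1] - image.shape[1] % block
    img = image[:h, :w]  # trim so the frame divides into whole tiles
    tiles = img.reshape(h // block, block, w // block, block, -1)
    means = tiles.mean(axis=(1, 3), keepdims=True)
    return np.broadcast_to(means, tiles.shape).reshape(h, w, -1)

frame = np.random.default_rng(1).random((32, 32, 3))
coarse = pixelate(frame)  # every 8x8 tile becomes one flat colour
```

Averaging each tile down to one colour is, in essence, what Whitney’s digital manipulation of the scanned frames achieved, one frame at a time.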
CGI was used occasionally through the next two decades, but only really caught on in the 1990s, when more powerful computers were available.
(The finished Westworld CGI sequence can be viewed online).
1976 – Steadicam
Garrett Brown got his start in the entertainment industry as a singer, before he turned to advertising. Based in Philadelphia in the 1960s, he produced imaginative ads that won a number of awards.
Brown was fascinated by cinematography, and taught himself to shoot using second hand equipment in a homemade studio. He was technically minded and liked to experiment, modifying his own gear.
In the early 1970s, Brown began tinkering with a system that would allow a camera to remain steady while it was being moved. To this point, this had only been possible using a dolly: a camera mounted on a platform, moved along a track.
Brown’s method attached the camera to an upright pole, connected to the operator via a mechanical arm. The rig deployed gimbals, a type of stabiliser, to isolate the camera from the operator’s movement, keeping it steady.
This set up provided much greater freedom for a cinematographer, and allowed shots that previously had been impossible. The new camera setup was subsequently bought by the Cinema Products Corporation, and named ‘Steadicam’.
Its first cinematic use came in ‘Bound for Glory’, a 1976 biopic of Woody Guthrie.
Brown operated the Steadicam himself in collaboration with Haskell Wexler, the legendary DP. It was used to capture scenes of drifters getting in and out of freight trains; Wexler would win an Oscar for the movie the following year.
Brown would subsequently set up an academy, to teach other camera operators how to use his device.
1978 – High Frame Rates
The frame rate of a movie is the number of images shown per second. In the early days of cinema this was around 16; with the arrival of sound in the late 1920s, it was standardised at 24.
And there it remained. At that rate, the human eye perceives a sequence of still images as continuous movement, and so no further increase seemed necessary.
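The arithmetic involved is simple but gets large quickly (the 90-minute runtime below is an arbitrary example):

```python
# Total frames in a 90-minute feature at various frame rates:
# every increase multiplies the film stock, storage and work.
for fps in (16, 24, 48, 60):
    print(f"{fps} fps -> {fps * 90 * 60:,} frames")
# 24 fps -> 129,600 frames; 60 fps -> 324,000 frames
```

Those multiplying frame counts are part of why higher rates stayed a niche pursuit for so long.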
Douglas Trumbull (pictured above) was a visual effects guru, most famous for his pioneering work on ‘2001: A Space Odyssey’.
He was also interested in frame rates: Trumbull felt that 24 frames a second left a viewer with a subconscious impression of the still images they were seeing.
Higher frame rates would remove this ‘flicker’ effect, and provide more fluid motion.
In 1978, Trumbull created a camera and projection system he called ‘Showscan’, which filmed in 70mm at 60 frames per second. He used the technique to make several short films to showcase the results.
Paramount was impressed, and greenlit a feature length film that would incorporate high frame rate scenes, called ‘Brainstorm’. Cost overruns would later cause the Showscan components to be dropped, and the film was eventually shot at regular speed in 35mm.
Showscan was then used to make novelty films for theme parks and carnivals.
Subsequently, high frame rates would occasionally be used in special effects sequences; the Rancor in ‘Return of the Jedi’ was shot at 90 frames per second, as one example.
But it was not until Peter Jackson experimented with 48 frames per second in his ‘Hobbit’ movies that the technique arrived in the mainstream (to many viewers’ dismay). Directors Ang Lee and James Cameron have since tried even higher frame rates; how this technology will be used in the future will be interesting to see.
1998 – Digital Cinematography
The first digital camera was a prototype built by Kodak in 1975. Famously, the company did not pursue the technology, preferring to focus on film.
It was not until 1988 that Kodak’s competitor Fuji brought the first digital camera to market. Digital video shortly followed.
The first film to use digital cameras was the low budget drama ‘Windhorse’, directed by Paul Wagner. Set in occupied Tibet, Windhorse tells the story of a Tibetan singer who collaborates with the Chinese authorities to further her career, then has a crisis of conscience when her sister is arrested and tortured.
Wagner’s choice to go digital was pragmatic.
Officially shooting the film in Nepal, Wagner and a small crew entered Tibet and shot footage there illegally, and in secret. The relatively small size of the digital equipment allowed the film crew to pose as tourists.
From this modest start, digital cinematography advanced rapidly. A year later, George Lucas would shoot scenes of his Star Wars prequel ‘The Phantom Menace’ on digital cameras, before filming its sequel, ‘Attack of the Clones’, entirely in digital.
2008 – IMAX
Christopher Nolan got his start directing twisty, small scale neo-noirs; ‘Following’ and ‘Memento’ were imaginatively scripted and assembled, but made on modest budgets.
He entered the Hollywood big time in 2005 with ‘Batman Begins’, and has subsequently pushed the boundaries of cinematic technology.
IMAX is a large scale film format, which debuted in 1970 at an expo in Japan. IMAX film is much larger, and so contains more visual information, than standard 35mm film, providing higher quality images.
The size of the film stock also makes it difficult to use: IMAX cameras are large, cumbersome, noisy and expensive. Until the 21st century they were only used to make IMAX specific movies; often short documentaries, which screened in dedicated cinemas.
For his highly anticipated 2008 Batman sequel, ‘The Dark Knight’, Nolan decided to film selected scenes in IMAX.
Nolan and his DP, Wally Pfister, worked to streamline the large cameras, to make them more practical. They would eventually shoot 6 scenes in full IMAX, totalling 28 minutes, including an armoured car chase which became one of the most famous action scenes in modern cinema.
These scenes were the first ever shot in IMAX for a feature length film.
Nolan has gone on to use IMAX in all his subsequent movies, and other leading film makers have followed his lead. The approach remains expensive and difficult, however, and is reserved for films where the visual aspect is paramount.